The course assumes basic knowledge of Class XII algebra and familiarity with calculus. Even so, the course will start by defining matrices and the operations associated with them. This leads to the study of systems of linear equations, elementary matrices, invertible matrices, the row-reduced echelon form of a matrix, and a few equivalent conditions for a square matrix to be invertible. From here, we move to the axiomatic definition of vector spaces over the real and complex numbers, study linear combinations, linear span, linear independence and linear dependence, and arrive at the notion of a basis of a finite-dimensional vector space. We then study functions from one vector space to another, commonly known as linear transformations. In the finite-dimensional case, we will see that all linear transformations can be understood through matrices, and vice versa. We will then define the inner/dot product on a vector space, which leads to the notions of the length of a vector and orthogonality between vectors. The main result of this part is the Gram-Schmidt orthogonalization process. Finally, we take up eigenvalues and eigenvectors associated with a square matrix or a linear operator. As a final result, we will learn the spectral theorem for Hermitian/self-adjoint matrices and, as an application, classify the quadrics.
INTENDED AUDIENCE : Mathematics Honours
PREREQUISITES : Class XII algebra and calculus
INDUSTRIES SUPPORT : None
COURSE LAYOUT
Week 1: Introduction to matrices; matrix operations such as addition, transpose, scalar multiplication and matrix multiplication; invertible matrices, examples, and submatrices of a matrix
Week 2: Systems of linear equations, elementary row operations, elementary matrices, Gauss elimination and Gauss-Jordan methods, LU decomposition, row-reduced echelon form of a matrix, rank of a matrix, and the solution set of a linear system
Week 3: Application of the solution set of a linear system to systems whose coefficient matrix is square; determinant of a matrix; inverse via the classical adjoint method and Cramer's rule
Week 4: Axiomatic definition of a vector space, examples, subspaces, linear combinations and linear span, finite-dimensional vector spaces, fundamental subspaces associated with a matrix
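As a small illustration of the Weeks 2-3 material (not part of the official syllabus), a linear system with an invertible square coefficient matrix can be checked for rank and solved numerically; this sketch uses NumPy's built-in solver rather than hand-worked Gauss elimination:

```python
import numpy as np

# Coefficient matrix A and right-hand side b for the system
#   x + 2y = 5
#  3x + 4y = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

# rank(A) = 2 = number of unknowns, so the system has a unique solution.
rank = np.linalg.matrix_rank(A)

# Solve A x = b (internally via an LU decomposition with partial pivoting,
# i.e. the machinery covered in Week 2).
x = np.linalg.solve(A, b)
print(rank)  # 2
print(x)     # [-4.   4.5]
```

Hand elimination on this 2x2 system gives the same answer: substituting x = 5 - 2y into the second equation yields y = 4.5 and x = -4.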
Week 5: Linear independence and dependence, linear independence and the rank of a matrix, basis of a vector space, constructing a basis of a finite-dimensional vector space
Week 6: Linear transformations, the rank-nullity theorem and its application to maps between finite-dimensional vector spaces
Week 7: Ordered bases, the matrix of a linear transformation, and similarity of matrices
Week 8: Dot/inner product on a vector space, the Cauchy-Schwarz inequality, angle between two vectors, projection of a vector onto another vector
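The Week 8 projection formula can be sketched in a few lines (an illustrative aside, not syllabus material): the projection of a onto a nonzero b is proj_b(a) = (<a, b> / <b, b>) b, and the residual a - proj_b(a) is orthogonal to b.

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

# proj_b(a) = (<a, b> / <b, b>) * b : the component of a along b
proj = (a @ b) / (b @ b) * b
residual = a - proj  # the component of a orthogonal to b

print(proj)          # [3. 0.]
print(residual @ b)  # 0.0, confirming orthogonality

# Cauchy-Schwarz: |<a, b>| <= ||a|| ||b||
print(abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b))  # True
```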
Week 9: Gram-Schmidt orthogonalization process and the QR decomposition, least-squares solution of an inconsistent linear system, and orthogonal projections
Week 10: Motivation, definition and examples of eigenvalues and eigenvectors; Schur's unitary triangularization
Week 11: Diagonalizable matrices, criteria for diagonalizability, diagonalizability of normal matrices, the spectral theorem for Hermitian matrices
Week 12: Quadratic forms, Sylvester's law of inertia, classification of quadrics
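The Week 9 Gram-Schmidt process can be sketched directly from its definition (again an illustrative aside, not part of the syllabus): each new vector has its components along the previously constructed orthonormal vectors subtracted off, and the remainder is normalized.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a list of linearly
    independent vectors, returned as the rows of a matrix."""
    basis = []
    for v in vectors:
        # Subtract the projections of v onto the already-built basis.
        w = v - sum((v @ q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])

# The rows of Q are orthonormal, so Q Q^T is the identity.
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
```

NumPy's `np.linalg.qr` computes the same orthonormalization (as the columns of its Q factor) by a numerically more stable route, which connects Gram-Schmidt to the QR decomposition mentioned in Week 9.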