$$\color{Blue} {\textbf{Linear Algebra}}$$
$\textbf{1. Properties of Determinants:}$
The determinant is defined only for square matrices. The properties below are sanity-checked in the sketch after this list.
 $\mid A^{T}\mid = \mid A \mid $
 $\mid AB \mid = \mid A \mid \mid B \mid $
 $\mid A^{n} \mid = \big(\mid A \mid\big)^{n}$
 $\mid kA\mid = k^{n} \mid A \mid$, where $A$ is an $n\times n$ matrix.
 If two rows (or two columns) of a determinant are interchanged, the sign of the value of the determinant changes.
 If in determinant any row or column is completely zero, the value of the determinant is zero.
 If two rows (or two columns) of a determinant are identical, the value of the determinant is zero.
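A minimal NumPy sketch of the first four properties (the example matrices and the scalar $k$ are arbitrary choices, not part of the notes):

```python
import numpy as np

# Arbitrary example matrices, chosen only for the demonstration
A = np.array([[2.0, 1.0],
              [3.0, 4.0]])
B = np.array([[0.0, 5.0],
              [1.0, 2.0]])
n, k = 2, 3.0
det = np.linalg.det

print(np.isclose(det(A.T), det(A)))                                # |A^T| = |A|
print(np.isclose(det(A @ B), det(A) * det(B)))                     # |AB| = |A||B|
print(np.isclose(det(np.linalg.matrix_power(A, 3)), det(A) ** 3))  # |A^n| = |A|^n
print(np.isclose(det(k * A), k ** n * det(A)))                     # |kA| = k^n |A|
```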
$\textbf{2. Matrix Multiplication:}$
It is defined for both square and non-square matrices, provided the number of columns of the first factor equals the number of rows of the second.
Let $\mathbf{A_{m\times n}}$ and $\mathbf{B_{n\times p}}$ be two matrices; then the resultant matrix $\mathbf{(AB)_{m\times p}}$ has
 Number of elements $=mp$
 Number of multiplications $= (mp)n = mnp$
 Number of additions $= mp(n-1)$ (these counts are verified in the sketch below)
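An illustrative sketch of these operation counts (the dimensions $m, n, p$ below are assumed example values):

```python
import numpy as np

m, n, p = 2, 3, 4                       # assumed example dimensions
A = np.arange(m * n).reshape(m, n)      # A is m x n
B = np.arange(n * p).reshape(n, p)      # B is n x p

C = A @ B                               # product is m x p
print(C.shape == (m, p))                # number of elements = mp = 8
print("multiplications:", m * n * p)    # each of the mp entries needs n multiplications
print("additions:", m * p * (n - 1))    # and n - 1 additions to sum those n products
```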
_________________________________________________________________
$\color{Red} { \textbf{Key Points:}}$
 $(Adj\: A)A = A(Adj\:A) = \mid A \mid I_{n}$
 $Adj(AB) = (Adj\:B)\cdot (Adj\: A)$
 $(AB)^{-1} = B^{-1}\cdot A^{-1}$
 $(AB)^{T} = B^{T}\cdot A^{T}$
 $(A^{T})^{-1} = (A^{-1})^{T}$
 $A\cdot A^{-1} = A^{-1} \cdot A = I$
 $Adj(Adj\:A) = \mid A\mid ^{n-2}\cdot A$
 $\mid Adj\: A \mid = \mid A \mid ^{n-1}$
 $\mid Adj(Adj\: A) \mid = \mid A \mid ^{{(n-1)}^{2}}$
 $Adj(A^{m}) = (Adj\:A)^{m}$
 $Adj(kA) = k^{n-1}(Adj \:A),\:k\in \mathbb{R}$
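NumPy has no built-in adjugate, so the helper `adj` below is an assumption of this sketch: for an invertible $A$ it computes $Adj\:A = \mid A \mid A^{-1}$, which follows from $(Adj\:A)A = \mid A \mid I_{n}$. A minimal check of a few of the identities above:

```python
import numpy as np

def adj(A):
    # Helper (not a NumPy function): for invertible A, Adj A = |A| * A^{-1}
    return np.linalg.det(A) * np.linalg.inv(A)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
n = A.shape[0]
det = np.linalg.det

print(np.allclose(adj(A) @ A, det(A) * np.eye(n)))         # (Adj A) A = |A| I_n
print(np.isclose(det(adj(A)), det(A) ** (n - 1)))          # |Adj A| = |A|^{n-1}
print(np.allclose(adj(adj(A)), det(A) ** (n - 2) * A))     # Adj(Adj A) = |A|^{n-2} A
print(np.allclose(adj(3.0 * A), 3.0 ** (n - 1) * adj(A)))  # Adj(kA) = k^{n-1} Adj A
```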
_________________________________________________________________
${\color{Magenta}{\textbf{Some More Points:}} }$
 Minimum number of zeros in a diagonal matrix of order $n$ is $n(n-1).$
 If $A = \text{diag}(a_{1},a_{2},a_{3})$ and $B = \text{diag}(b_{1},b_{2},b_{3})$, then $AB = \text{diag}(a_{1}b_{1},a_{2}b_{2},a_{3}b_{3})$ (verified in the sketch after this list).
 For a diagonal or triangular matrix (upper or lower triangular), the determinant is equal to the product of the leading diagonal elements.
 A matrix which is both symmetric and skew-symmetric must be a null matrix.
 All the diagonal elements of a skew-Hermitian matrix are either zero or purely imaginary.
 All the diagonal elements of a Hermitian matrix are real.
 The determinant of an idempotent matrix is either $0$ or $1.$
 The determinant and trace of a nilpotent matrix are zero.
 The inverse of a nilpotent matrix does not exist (a nilpotent matrix is singular).
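A minimal NumPy sketch of the diagonal-product and triangular-determinant points above (the entries are arbitrary examples):

```python
import numpy as np

D1 = np.diag([1.0, 2.0, 3.0])                             # diag(a1, a2, a3)
D2 = np.diag([4.0, 5.0, 6.0])                             # diag(b1, b2, b3)
print(np.allclose(D1 @ D2, np.diag([4.0, 10.0, 18.0])))   # diag(a_i * b_i)

U = np.array([[2.0, 7.0, 1.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 4.0]])                           # upper triangular
print(np.isclose(np.linalg.det(U), 2.0 * 3.0 * 4.0))      # det = product of diagonal entries
```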

$\color{green}{\checkmark}$ A square matrix all of whose eigenvalues are zero is a nilpotent matrix.
$\color{green}{\checkmark}$ In linear algebra, a nilpotent matrix is a square matrix $A$ such that $A^{k}=0$ for some positive integer $k.$ The smallest such $k$ is sometimes called the index of $A.$
Example$:$ the matrix $A = \begin{bmatrix} 0& 0\\1 &0 \end{bmatrix}$ is nilpotent with index $2,$ since $A^{2} = 0$ (checked numerically below).
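A quick NumPy check of this example, including the determinant and trace facts stated above:

```python
import numpy as np

A = np.array([[0.0, 0.0],
              [1.0, 0.0]])

print(np.allclose(np.linalg.matrix_power(A, 2), 0))   # A^2 = 0: nilpotent with index 2
print(np.linalg.eigvals(A))                           # both eigenvalues are 0
print(np.isclose(np.trace(A), 0.0),
      np.isclose(np.linalg.det(A), 0.0))              # trace = 0 and |A| = 0, so A^{-1} does not exist
```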
___________________________________________________________________
${\color{Orange}{\textbf{Properties of Eigen Values:}} }$
 The sum of eigenvalues of a matrix is equal to the trace of the matrix, where the sum of the elements of the principal diagonal of a matrix is called the trace of the matrix. $$\sum_{i=1}^{n}\lambda_{i} = \lambda_{1} + \lambda_{2} + \lambda_{3} + \dots + \lambda_{n} = \text{Trace of the matrix}$$
 The product of eigenvalues of a matrix $A$ is equal to the determinant of matrix $A.$ $$\prod_{i=1}^{n}\lambda_{i} = \lambda_{1} \cdot \lambda_{2} \cdot \lambda_{3} \cdots \lambda_{n} = \mid A \mid $$
 For a Hermitian matrix, every eigenvalue is real.
 The eigenvalues of a skew-Hermitian matrix are all purely imaginary or zero.
 Every eigenvalue of a unitary matrix has absolute value $1,$ i.e. $ \mid \lambda \mid = 1$
 Any square matrix $A$ and its transpose $A^{T}$ have the same eigenvalues.
 If $\lambda_{1},\lambda_{2},\lambda_{3},\dots ,\lambda_{n}$ are eigenvalues of a matrix $A$, then the eigenvalues of
$\color{green}\checkmark\: kA$ are $k\lambda_{1},k\lambda_{2},k\lambda_{3},\dots, k\lambda_{n}$
$\color{green}\checkmark\:A^{m}$ are $\lambda_{1}^{m},\lambda_{2}^{m},\lambda_{3}^{m},\dots ,\lambda_{n}^{m}$
$\color{green}\checkmark\:A^{-1}$ are $\frac{1}{\lambda_{1}},\frac{1}{\lambda_{2}},\frac{1}{\lambda_{3}},\dots ,\frac{1}{\lambda_{n}}$
$\color{green}\checkmark\:A+kI$ are $\lambda_{1}+k,\lambda_{2}+k,\lambda_{3}+k,\dots ,\lambda_{n}+k$
 If $\lambda$ is an eigenvalue of an orthogonal matrix $A$, then $\dfrac{1}{\lambda}$ is also an eigenvalue of $A$ (since $A^{T} = A^{-1}$).
 The eigenvalues of a symmetric matrix are purely real.
 The eigenvalues of a skew-symmetric matrix are either purely imaginary or zero.
 Zero is an eigenvalue of a matrix if and only if the matrix is singular.
 If all the eigenvalues are distinct, then the corresponding eigenvectors are linearly independent.
 The set of eigenvalues is called the spectrum of $A,$ and the largest eigenvalue in magnitude is called the spectral radius of $A,$ where $A$ is the given matrix. Several of these properties are checked numerically in the sketch below.
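A minimal NumPy sketch of several properties in this list (the matrix is an arbitrary example with eigenvalues $2$ and $5$):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = np.sort(np.linalg.eigvals(A))                   # eigenvalues: 2 and 5

print(np.isclose(lam.sum(), np.trace(A)))             # sum of eigenvalues = trace
print(np.isclose(lam.prod(), np.linalg.det(A)))       # product of eigenvalues = |A|
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), lam))            # A and A^T: same eigenvalues
print(np.allclose(np.sort(np.linalg.eigvals(3.0 * A)), 3.0 * lam))  # kA has eigenvalues k*lambda
print(np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))),
                  np.sort(1.0 / lam)))                # A^{-1} has eigenvalues 1/lambda
print(np.allclose(np.sort(np.linalg.eigvals(A + 2.0 * np.eye(2))),
                  lam + 2.0))                         # A + kI has eigenvalues lambda + k
print("spectral radius:", np.abs(lam).max())          # largest eigenvalue in magnitude
```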
${\color{Orchid}{\textbf{Properties of Eigen Vectors:}} }$
 For every eigenvalue there exists at least one eigenvector.
 If $\lambda$ is an eigenvalue of a matrix $A,$ then the corresponding eigenvector $X$ is not unique; any non-zero scalar multiple of $X$ is also an eigenvector, so there are infinitely many eigenvectors corresponding to a single eigenvalue.
 If $\lambda_{1},\lambda_{2},\lambda_{3},\dots ,\lambda_{n}$ are distinct eigenvalues of an $n\times n$ matrix, then the corresponding eigenvectors $X_{1},X_{2},X_{3},\dots,X_{n}$ form a linearly independent set.
 If two or more eigenvalues are equal, the corresponding eigenvectors may or may not be linearly independent (e.g., the identity matrix has the repeated eigenvalue $1$ yet $n$ linearly independent eigenvectors).
 Two eigenvectors $X_{1}$ and $X_{2}$ are called orthogonal vectors if $X_{1}^{T}X_{2}=0.$
 $\textbf{Normalized eigenvectors:}$ A normalized eigenvector is an eigenvector of length one. Consider an eigenvector $X = \begin{bmatrix}a \\ b\end{bmatrix}_{2\times 1}$; then the length of this eigenvector is $\left\| X \right\| = \sqrt {a^{2} + b^{2}}.$ The normalized eigenvector is $\hat{X} = \dfrac{X}{\left\| X \right\|}=\dfrac{\text{Eigenvector}}{\text{Length of eigenvector}} =\begin{bmatrix}\dfrac{a}{\sqrt{a^{2} + b^{2}}} \\ \dfrac{b}{\sqrt{a^{2} + b^{2}}} \end{bmatrix}_{2\times 1}$
$\color{green}\checkmark\:$Length of the normalized eigenvector is always unity.
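A minimal sketch of the normalization with the assumed values $a = 3$ and $b = 4$:

```python
import numpy as np

X = np.array([3.0, 4.0])                        # eigenvector with a = 3, b = 4
X_hat = X / np.linalg.norm(X)                   # divide by the length sqrt(a^2 + b^2) = 5

print(X_hat)                                    # [0.6  0.8]
print(np.isclose(np.linalg.norm(X_hat), 1.0))   # the normalized eigenvector has length 1
```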
$\textbf{3. Rank of the Matrix:}$
The rank of a matrix $A$ is the maximum number of linearly independent rows or columns. A matrix is a full-rank matrix if all its rows and columns are linearly independent. The rank of the matrix $A$ is denoted by $\rho{(A)}.$
${\color{Purple}{\textbf{Properties of rank of matrix:}} }$
 The rank of a matrix does not change under elementary transformations, so we can calculate the rank by reducing the matrix to echelon form; in echelon form, the rank of the matrix is the number of non-zero rows.
 The rank of a matrix is zero if and only if the matrix is a null matrix.
 $\rho(A)\leq \min(\text{number of rows, number of columns})$
 $\rho(AB)\leq \text{min}[\rho(A), \rho(B)]$
 $\rho(A^{T}A) = \rho(AA^{T}) = \rho(A) = \rho(A^{T})$
 If $A$ and $B$ are matrices of the same order, then $\rho(A+B)\leq \rho(A) + \rho(B)$ and $\rho(A-B)\geq \rho(A)-\rho(B)$
 If $A^{\theta}$ is the conjugate transpose of $A,$ then $\rho(A^{\theta})= \rho(AA^{\theta}) = \rho(A^{\theta}A) = \rho(A) $
 The rank of a skew-symmetric matrix cannot be one.
 If $A$ and $B$ are two $n$-rowed square matrices, then $\rho(AB)\geq \rho(A) + \rho(B) - n$ (Sylvester's inequality). Several of these rank properties are checked numerically in the sketch below.
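A minimal NumPy sketch of some of these rank properties (the example matrices are arbitrary):

```python
import numpy as np

rank = np.linalg.matrix_rank
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # second row = 2 * first row, so rho(A) = 1

print(rank(A) <= min(A.shape))                                 # rho(A) <= min(rows, columns)
print(rank(A.T @ A) == rank(A @ A.T) == rank(A) == rank(A.T))  # all equal rho(A)

B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
print(rank(A @ B) <= min(rank(A), rank(B)))                    # rho(AB) <= min(rho(A), rho(B))
```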
$\textbf{4. Solution of Linear Simultaneous Equations:}$
There are two types of linear simultaneous equations:
 Linear homogeneous equation$:AX = 0$
 Linear non-homogeneous equation$:AX = B$
Steps to investigate the consistency of a system of linear equations:
 First, represent the equations in matrix form as $AX = B$
 The system $AX = B$ is then checked for consistency by forming the augmented matrix $[A:B].$
$\textbf{Augmented Matrix}\: \mathbf{[A:B]:}$
 $\rho(A)\neq \rho([A:B])$ inconsistent $\color{Red} {\textbf{(No solution)}}$
 $\rho(A) = \rho([A:B])$ consistent $\color{green} {\textbf{(Always has a solution)}}$
$\color{green}\checkmark\:\rho(A) = \rho([A:B]) = \text{Number of unknown variables}\:\: \color{Cyan} {\textbf{(Unique solution)}}$
$\color{green}\checkmark\:\rho(A) = \rho([A:B]) < \text{Number of unknown variables}\:\: \color{Salmon} {\textbf{(Infinitely many solutions)}}$
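A minimal sketch of this rank test; the function name `classify` is an assumption of these notes, not a library routine:

```python
import numpy as np

def classify(A, B):
    # Rank test on the coefficient matrix A and the augmented matrix [A : B]
    aug = np.hstack([A, B.reshape(-1, 1)])
    rA, rAB = np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug)
    if rA != rAB:
        return "no solution"                      # inconsistent
    if rA == A.shape[1]:
        return "unique solution"                  # rank = number of unknowns
    return "infinitely many solutions"            # rank < number of unknowns

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
print(classify(A, np.array([2.0, 5.0])))          # ranks differ -> no solution
print(classify(A, np.array([2.0, 4.0])))          # rank 1 < 2 unknowns -> infinitely many
print(classify(np.eye(2), np.array([1.0, 2.0])))  # rank 2 = 2 unknowns -> unique solution
```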
______________________________________________________________
$\color{green}\checkmark$ The linear homogeneous equation $AX = 0$ is always consistent.
If $A$ is a square matrix of order $n$ and
$\color{green}\checkmark\:\mid A \mid = 0$, then the rows and columns are $\color{Teal} { \text{linearly dependent}}$ and the system has a $\color{Magenta} { \text{non-trivial solution (infinitely many solutions).}}$
$\color{green}\checkmark\:\mid A \mid \neq 0$, then the rows and columns are $\color{purple} {\text{linearly independent}}$ and the system has $\color{green} {\text{only the trivial solution (unique solution).}}$
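A minimal sketch of both cases (the matrices are arbitrary examples):

```python
import numpy as np

A_singular = np.array([[1.0, 2.0],
                       [2.0, 4.0]])    # |A| = 0: rows linearly dependent
A_regular = np.array([[1.0, 2.0],
                      [3.0, 4.0]])     # |A| != 0: rows linearly independent

print(np.isclose(np.linalg.det(A_singular), 0.0))   # singular case
print(A_singular @ np.array([2.0, -1.0]))           # non-trivial solution X = (2, -1): AX = 0
print(np.linalg.solve(A_regular, np.zeros(2)))      # only the trivial solution X = (0, 0)
```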