If a square matrix $A_{n \times n}$ has $n$ linearly independent columns, then $\det(A) \neq 0$.
proof: let $A_{n\times n}$ be a square matrix with $n$ linearly independent columns.
we can perform Gaussian elimination on $A$ to transform it into row echelon form; let's call the resulting matrix $R$.
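as a concrete companion to this step, here is a minimal sketch of forward elimination in Python with NumPy (the tooling and the helper name `row_echelon` are my choices, not part of the proof):

```python
import numpy as np

def row_echelon(A):
    """Forward Gaussian elimination: return a row echelon form of A."""
    R = A.astype(float).copy()
    rows, cols = R.shape
    r = 0  # row where the next pivot should land
    for c in range(cols):
        # find a row at or below r with a non-zero entry in column c
        piv = next((i for i in range(r, rows) if abs(R[i, c]) > 1e-12), None)
        if piv is None:
            continue                  # no pivot in this column
        R[[r, piv]] = R[[piv, r]]     # swap the pivot row into place
        # zero out everything below the pivot
        R[r + 1:] -= np.outer(R[r + 1:, c] / R[r, c], R[r])
        r += 1
    return R

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
print(row_echelon(A))  # upper triangular with non-zero diagonal
```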
we know that elementary row operations preserve linear dependence relations among the columns, and this is something we will take for granted here.
therefore, $R$ also has $n$ linearly independent columns.
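a quick numerical illustration (NumPy's `matrix_rank` counts the number of linearly independent columns, so it should be unchanged by each operation):

```python
import numpy as np

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])  # rank 3
B = A.copy()
B[1] -= 2 * B[0]       # add a multiple of one row to another
B[[0, 2]] = B[[2, 0]]  # swap two rows
B[1] *= 5.0            # scale a row by a non-zero scalar
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 3 3
```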
in row echelon form, linear independence of the columns implies that the reduced row echelon form of $R$ is the identity matrix $I$; from here on, let $R$ denote this fully reduced matrix. now an obvious question would be: why is this true?
first, recall that a square matrix in row echelon form has these properties (encoded as a small checker in the sketch after this list):
- All zero rows, if any, are at the bottom.
- The first non-zero element in each row (the pivot) is to the right of the pivot in the row above it.
- All elements below a pivot are zero.
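a minimal sketch of that checker, assuming a 2-D NumPy-compatible array and a small tolerance for deciding what counts as non-zero:

```python
import numpy as np

def is_row_echelon(R, tol=1e-12):
    """Check the row echelon properties listed above."""
    pivots = []              # column index of each row's leading non-zero entry
    seen_zero_row = False
    for row in np.asarray(R, dtype=float):
        nz = np.flatnonzero(np.abs(row) > tol)
        if nz.size == 0:
            seen_zero_row = True   # zero rows must stay at the bottom
            continue
        if seen_zero_row:
            return False           # a non-zero row below a zero row
        if pivots and nz[0] <= pivots[-1]:
            return False           # pivot not strictly right of the one above
        pivots.append(nz[0])
    # zeros below each pivot follow from the strictly-right condition
    return True

print(is_row_echelon([[2, 1, 1], [0, 1, 1], [0, 0, 2]]))  # True
print(is_row_echelon([[2, 1, 1], [3, 1, 0], [0, 0, 2]]))  # False
```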
now, let's consider a square matrix $B$ in row echelon form with linearly independent columns.
we know that for a matrix in row echelon form, the number of pivots equals its rank.
since the columns are linearly independent, the rank of $B$ must equal its number of columns, $n$.
This means $B$ must have $n$ pivots: one in each row and each column.
If $B$ were not the identity matrix, one of these scenarios would occur:
- a pivot is not 1
- there are non-zero elements above some pivots
Case 1: if a pivot is not 1, we can make it 1 by scaling its row. this doesn't affect linear independence.
Case 2: if there are non-zero elements above pivots, we can eliminate them using row operations, again not affecting linear independence of columns.
after these operations, we'd have the identity matrix.
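these two cases are exactly the back pass of Gauss-Jordan elimination; a minimal sketch, assuming a square row echelon input whose $n$ pivots all sit on the diagonal:

```python
import numpy as np

def ref_to_identity(R):
    """Reduce a square REF matrix with a non-zero diagonal to the identity."""
    B = np.asarray(R, dtype=float).copy()
    n = B.shape[0]
    for i in range(n):
        B[i] /= B[i, i]            # Case 1: scale each pivot to 1
    for i in range(n - 1, 0, -1):
        for j in range(i):
            B[j] -= B[j, i] * B[i] # Case 2: clear entries above the pivot
    return B

R = np.array([[2., 1., 1.], [0., 1., 1.], [0., 0., 2.]])
print(ref_to_identity(R))  # the 3x3 identity matrix
```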
also, if $B$ had a zero on its main diagonal, it would have fewer than $n$ pivots, contradicting the linear independence of its columns.
why does a zero on the main diagonal contradict linear independence?
- Suppose the $i^{th}$ diagonal element is zero.
- The $i^{th}$ column would have zeros from row $i$ downward.
- Its non-zero elements would align with pivots of columns to its left.
- This means the $i^{th}$ column could be expressed as a linear combination of columns to its left.
- This contradicts the assumption of linear independence.
example: $\begin{bmatrix}\alpha & 0 & 0 & a\\0 & \beta & 0 & b\\0 & 0 & \delta & c\\0 & 0 & 0 & 0\\\end{bmatrix}$
here we can clearly see that since the diagonal element $b_{44} = 0$,
$\begin{bmatrix}a \\ b \\ c \\ 0 \end{bmatrix} = \frac{a}{\alpha}\begin{bmatrix}\alpha \\ 0 \\ 0 \\ 0 \end{bmatrix} + \frac{b}{\beta}\begin{bmatrix}0 \\ \beta \\ 0 \\ 0 \end{bmatrix} + \frac{c}{\delta}\begin{bmatrix}0 \\ 0 \\ \delta \\ 0 \end{bmatrix}$
and this makes column $C_4$ linearly dependent on columns $C_1$, $C_2$, and $C_3$ (the divisions are legal because $\alpha, \beta, \delta$ are pivots, hence non-zero).
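to check this numerically, plug arbitrary non-zero values into $\alpha, \beta, \delta, a, b, c$ (the specific numbers below are my choice):

```python
import numpy as np

# alpha=2, beta=3, delta=5 and a=4, b=6, c=10
B = np.array([[2., 0., 0.,  4.],
              [0., 3., 0.,  6.],
              [0., 0., 5., 10.],
              [0., 0., 0.,  0.]])
# C4 = 2*C1 + 2*C2 + 2*C3, so only 3 of the 4 columns are independent
print(np.linalg.matrix_rank(B))  # 3
```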
therefore, the only reduced row echelon form of a square matrix with linearly independent columns is the identity matrix.
now, let's consider how the determinant is affected by elementary row operations (verified numerically in the sketch after this list):
- Swapping two rows multiplies the determinant by -1
- Multiplying a row by a non-zero scalar multiplies the determinant by that scalar
- Adding a multiple of one row to another doesn't change the determinant
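here is the numerical check promised above, on a random $4 \times 4$ matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
d = np.linalg.det(A)

S = A.copy(); S[[0, 2]] = S[[2, 0]]  # swap two rows
M = A.copy(); M[1] *= 3.0            # scale a row by 3
E = A.copy(); E[2] += 5.0 * E[0]     # add a multiple of one row to another

print(np.isclose(np.linalg.det(S), -d))     # True: sign flips
print(np.isclose(np.linalg.det(M), 3 * d))  # True: scaled by 3
print(np.isclose(np.linalg.det(E), d))      # True: unchanged
```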
let's say that in going from $A$ to $R$ we performed $k$ row swaps and scaled rows by the non-zero scalars $c_1, c_2, \ldots, c_m$. applying the rules above:
$\det(R) = (-1)^k \, c_1 c_2 \cdots c_m \, \det(A)$
we know $\det(R) = \det(I) = 1$, so
$\det(A) = \frac{(-1)^k}{c_1 c_2 \cdots c_m}$
this is a non-zero value, as it is built entirely from non-zero factors.
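finally, a sketch that performs the full reduction while tracking the swap count $k$ and the scalars $c_i$, then recovers $\det(A)$ as $(-1)^k / (c_1 \cdots c_m)$, with `numpy.linalg.det` as a sanity check:

```python
import numpy as np

def det_via_elimination(A):
    """Reduce A to I, tracking k swaps and scalars c_i; det(A) = (-1)^k / prod(c_i)."""
    R = np.asarray(A, dtype=float).copy()
    n = R.shape[0]
    k, scale_product = 0, 1.0
    for i in range(n):
        piv = i + int(np.argmax(np.abs(R[i:, i])))  # pick the largest pivot below
        if abs(R[piv, i]) < 1e-12:
            return 0.0                  # no pivot: columns are dependent
        if piv != i:
            R[[i, piv]] = R[[piv, i]]   # row swap
            k += 1
        c = 1.0 / R[i, i]               # scalar that makes this pivot 1
        scale_product *= c
        R[i] *= c
        for j in range(n):
            if j != i:
                R[j] -= R[j, i] * R[i]  # clear the rest of column i
    return (-1) ** k / scale_product    # det(R) = det(I) = 1

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
print(det_via_elimination(A), np.linalg.det(A))  # both ~4.0
```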
thus, we've shown that if a square matrix $A_{n \times n}$ has $n$ linearly independent columns, then $\det(A) \neq 0$.