1 Answer

0 votes
0 votes

If a square matrix $A_{n \times n}$ has $n$ linearly independent columns, then $\det(A) \neq 0$.

Proof: Let $A_{n \times n}$ be a square matrix with $n$ linearly independent columns.

We can perform Gaussian elimination on $A$ to transform it into row echelon form; call the resulting matrix $R$.
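For concreteness, here is a minimal NumPy sketch of this elimination step; the helper name `row_echelon` and the use of partial pivoting are my own choices for illustration, not part of the original argument:

```python
import numpy as np

def row_echelon(A):
    """Reduce a copy of A to row echelon form using partial pivoting.

    Returns the echelon matrix R and the number of row swaps performed
    (the swap count matters for the determinant bookkeeping later).
    """
    R = A.astype(float).copy()
    n_rows, n_cols = R.shape
    swaps = 0
    row = 0
    for col in range(n_cols):
        if row == n_rows:
            break
        # pick the largest entry at or below `row` in this column as pivot
        pivot = row + np.argmax(np.abs(R[row:, col]))
        if np.isclose(R[pivot, col], 0.0):
            continue  # no pivot in this column, move right
        if pivot != row:
            R[[row, pivot]] = R[[pivot, row]]  # row swap
            swaps += 1
        # zero out every entry below the pivot
        for r in range(row + 1, n_rows):
            R[r] -= (R[r, col] / R[row, col]) * R[row]
        row += 1
    return R, swaps
```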

We know that elementary row operations do not change the linear dependence relations among the columns: they preserve the null space of the matrix, and the columns are linearly independent exactly when that null space is trivial. We will take this fact for granted.

Therefore, $R$ also has $n$ linearly independent columns.

In row echelon form, linear independence of the columns forces a pivot in every row and column, and a few more row operations then bring $R$ to reduced row echelon form, which for a square matrix is the identity matrix $I$. An obvious question is: why is this true?

First, recall that a square matrix in row echelon form has these properties (the sketch after this list checks them programmatically):

  • All zero rows, if any, are at the bottom.
  • The first non-zero element in each row (the pivot) is to the right of the pivot in the row above it.
  • All elements below a pivot are zero.
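Here is a small sketch checking these properties; the helper name `is_row_echelon` and the numerical tolerance are hypothetical choices of mine. Note that the third property is already implied by the pivots moving strictly to the right:

```python
import numpy as np

def is_row_echelon(R, tol=1e-12):
    """Check the row echelon properties listed above."""
    last_pivot_col = -1
    seen_zero_row = False
    for row in R:
        nz = np.flatnonzero(np.abs(row) > tol)  # indices of non-zero entries
        if nz.size == 0:
            seen_zero_row = True                # zero rows must stay at the bottom
            continue
        if seen_zero_row:
            return False                        # non-zero row below a zero row
        if nz[0] <= last_pivot_col:
            return False                        # pivot not strictly to the right
        last_pivot_col = nz[0]
    # pivots moving strictly right already forces zeros below each pivot
    return True
```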

Now, let's consider a square matrix $B$ in row echelon form with linearly independent columns.

We know that for a matrix in row echelon form, the number of pivots equals the rank of the matrix.

Since the columns are linearly independent, the rank of $B$ must equal its dimension $n$.

This means $B$ must have $n$ pivots, one in each row and each column.
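A quick numerical illustration, with entries chosen arbitrarily by me: a square row echelon matrix with a pivot in every row and column has full rank.

```python
import numpy as np

# An illustrative 3x3 row echelon matrix with a pivot in every row and column.
B = np.array([[2., 1., 3.],
              [0., 5., 4.],
              [0., 0., 7.]])
print(np.linalg.matrix_rank(B))  # -> 3: three pivots, full rank
```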

If $B$ were not the identity matrix, one of these scenarios would occur:

  1. A pivot is not 1.
  2. There are non-zero elements above some pivots.

Case 1: If a pivot is not 1, we can make it 1 by scaling its row. This does not affect linear independence.

Case 2: If there are non-zero elements above pivots, we can eliminate them using row operations, again without affecting the linear independence of the columns.

After these operations, we'd have the identity matrix.
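Here is a minimal sketch of that final reduction, assuming a full-rank square echelon matrix whose pivots sit on the diagonal (the function name `echelon_to_identity` is hypothetical):

```python
import numpy as np

def echelon_to_identity(R):
    """Turn a full-rank square echelon matrix into I (cases 1 and 2 above).

    Assumes every diagonal entry (pivot) of R is non-zero.
    Returns the reduced matrix and the scaling factors used.
    """
    R = R.astype(float).copy()
    n = R.shape[0]
    scales = []
    for i in range(n - 1, -1, -1):
        scales.append(R[i, i])
        R[i] /= R[i, i]                 # case 1: scale the pivot to 1
        for r in range(i):
            R[r] -= R[r, i] * R[i]      # case 2: clear entries above the pivot
    return R, scales
```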

If $B$ had a zero on its main diagonal, it would have fewer than $n$ pivots, contradicting the linear independence of its columns.

Why does a zero on the main diagonal contradict linear independence?

  • Suppose the $i^{th}$ diagonal element is zero.
  • The $i^{th}$ column would have zeros from row $i$ downward.
  • Its non-zero elements would align with pivots of columns to its left.
  • This means the $i^{th}$ column could be expressed as a linear combination of columns to its left.
  • This contradicts the assumption of linear independence.

Example: $\begin{bmatrix}\alpha & 0 & 0 & a\\0 & \beta & 0 & b\\0 & 0 & \delta & c\\0 & 0 & 0 & 0\\\end{bmatrix}$

Here we can clearly see that since the diagonal element $b_{44} = 0$,

$\begin{bmatrix}a \\ b \\ c \\ 0 \end{bmatrix} = \frac{a}{\alpha}\begin{bmatrix}\alpha \\ 0 \\ 0 \\ 0 \end{bmatrix} + \frac{b}{\beta}\begin{bmatrix}0 \\ \beta \\ 0 \\ 0 \end{bmatrix} + \frac{c}{\delta}\begin{bmatrix}0 \\ 0 \\ \delta \\ 0 \end{bmatrix}$

and this makes column $C_4$ linearly dependent on columns $C_1$, $C_2$, and $C_3$ (the pivots $\alpha$, $\beta$, $\delta$ are non-zero, so the coefficients are well defined).
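A quick numerical check of this dependence, with illustrative values $\alpha = 2$, $\beta = 3$, $\delta = 4$ and $a = 6$, $b = 9$, $c = 8$ of my own choosing:

```python
import numpy as np

# Concrete values for alpha, beta, delta and a, b, c (chosen for illustration).
B = np.array([[2., 0., 0., 6.],
              [0., 3., 0., 9.],
              [0., 0., 4., 8.],
              [0., 0., 0., 0.]])
# Solve for the coefficients expressing C4 in terms of C1, C2, C3.
coeffs, *_ = np.linalg.lstsq(B[:, :3], B[:, 3], rcond=None)
print(coeffs)  # -> [3. 3. 2.], i.e. C4 = (a/alpha)C1 + (b/beta)C2 + (c/delta)C3
```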

Therefore, the only reduced row echelon form that maintains linear independence of columns in a square matrix is the identity matrix.

Now, let's consider how the determinant is affected by elementary row operations (verified numerically in the sketch after this list):

  • Swapping two rows multiplies the determinant by -1
  • Multiplying a row by a non-zero scalar multiplies the determinant by that scalar
  • Adding a multiple of one row to another doesn't change the determinant 
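These three rules can be checked numerically; the random test matrix below is my own illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
d = np.linalg.det(A)

S = A.copy(); S[[0, 1]] = S[[1, 0]]            # swap two rows
M = A.copy(); M[2] *= 5.0                      # scale a row by 5
E = A.copy(); E[3] += 2.0 * A[0]               # add a multiple of another row

print(np.allclose(np.linalg.det(S), -d))       # True: sign flips
print(np.allclose(np.linalg.det(M), 5.0 * d))  # True: det scales by 5
print(np.allclose(np.linalg.det(E), d))        # True: det unchanged
```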

Let's say that in going from $A$ to $R = I$ we performed $k$ row swaps and scaled rows by the non-zero factors $c_1, c_2, \dots, c_m$ (row additions leave the determinant unchanged). Then:

$\det(R) = (-1)^k \, c_1 c_2 \cdots c_m \, \det(A)$

We know $\det(R) = \det(I) = 1$, so

$\det(A) = \dfrac{(-1)^k}{c_1 c_2 \cdots c_m}$

This is a non-zero value, as it is a quotient of non-zero terms.
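As a sanity check of the whole statement, here is a small sketch (my own illustration) confirming that a matrix with $n$ independent columns has a non-zero determinant:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))       # a random matrix is almost surely full rank
n = A.shape[0]
if np.linalg.matrix_rank(A) == n:     # n linearly independent columns
    print(np.linalg.det(A))           # non-zero, as the proof predicts
```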

Thus, we've shown that if a square matrix $A_{n \times n}$ has $n$ linearly independent columns, then $\det(A) \neq 0$.
