All matrices we talk about here are $n \times n$ square matrices over a numeric field.

In linear algebra the definition of the concept **invertible matrix** is usually given this way.

**Def.:** A matrix $A$ is said to be **invertible** if there exists a matrix $B$ such that $AB=BA=I$, where $I$ is the identity matrix. In this case $B$ is called the **inverse matrix** of $A$.

But I somehow don't like this definition; it seems too strong to me, since it requires $B$ to be both a left inverse (meaning $BA=I$) and a right inverse (meaning $AB=I$) of the matrix $A$.

So let us try to introduce this concept in a somewhat different way.

**Def.1:** A matrix $A$ is said to be **left invertible** if there exists a matrix $B$ such that $BA=I$, where $I$ is the identity matrix. In this case $B$ is called a **left inverse** of $A$.

**Def.2:** A matrix $A$ is said to be **right invertible** if there exists a matrix $B$ such that $AB=I$, where $I$ is the identity matrix. In this case $B$ is called a **right inverse** of $A$.
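These two conditions can be checked numerically. Here is a minimal sketch with NumPy (the matrices $A$ and $B$ are hypothetical example values, not part of the post's theory); for this particular pair, both the left-inverse and the right-inverse condition happen to hold:

```python
import numpy as np

# Hypothetical 2x2 example: check the conditions from Def.1 and Def.2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[-2.0, 1.0],
              [1.5, -0.5]])  # candidate inverse of A

I = np.eye(2)
print(np.allclose(B @ A, I))  # left-inverse condition:  BA = I
print(np.allclose(A @ B, I))  # right-inverse condition: AB = I
```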

We don't know yet whether left/right inverses of a matrix $A$ exist and under what conditions. We also don't know whether they are unique (in case they exist).

Now we will prove a few statements to clarify all this.

**Th1:** If a matrix $A$ is left (or right) invertible, then $\det(A) \ne 0$.

**Proof:** We know that $\det(XY) = \det(X) \cdot \det(Y)$.

If $A$ is left invertible, then there exists a matrix $B$ such that $BA = I$. But then $1 = \det(I) = \det(BA) = \det(B) \cdot \det(A)$, and it follows that $\det(A) \ne 0$.

If $A$ is right invertible, then there exists a matrix $B$ such that $AB = I$. But then $1 = \det(I) = \det(AB) = \det(A) \cdot \det(B)$, and again it follows that $\det(A) \ne 0$.
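The key fact used above, the multiplicativity of the determinant, can also be observed numerically. A quick sketch with NumPy (random matrices, illustrative only, not part of the proof):

```python
import numpy as np

# Sanity check of det(XY) = det(X) * det(Y), the key fact in Th1's proof.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
Y = rng.standard_normal((4, 4))
lhs = np.linalg.det(X @ Y)
rhs = np.linalg.det(X) * np.linalg.det(Y)
print(np.isclose(lhs, rhs))  # equal up to floating-point error
```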

**Th2:** If $\det(A) \ne 0$, then $A$ is both left and right invertible.

**Proof:** The proof here is by construction. If $A=(a_{ij})$, we construct the matrix $S=(A_{ji})$, the **transpose** of the matrix of **cofactors** of $A$ (here $A_{ij}$ denotes the cofactor of $a_{ij}$). Then one easily shows (using the cofactor expansion of the determinant, known from linear algebra) that the matrix $T = \frac{1}{\det(A)} \cdot S$ satisfies both $TA=I$ and $AT=I$.
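This construction translates directly into code. Below is a sketch of the cofactor construction in NumPy (the function name `cofactor_inverse` is my own; the construction is for illustration only, since computing all minors this way is far too slow for large matrices):

```python
import numpy as np

def cofactor_inverse(A):
    """Build T = (1/det A) * S, where S is the transposed cofactor matrix.

    Illustrative sketch of the construction in Th2; assumes det(A) != 0.
    """
    n = A.shape[0]
    d = np.linalg.det(A)
    S = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            # Cofactor A_ij goes to position (j, i): S is transposed.
            S[j, i] = (-1) ** (i + j) * np.linalg.det(minor)
    return S / d

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])  # det(A) = 7, so T exists
T = cofactor_inverse(A)
print(np.allclose(T @ A, np.eye(3)))  # T is a left inverse
print(np.allclose(A @ T, np.eye(3)))  # T is a right inverse
```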

So far we have proved that a matrix $A$ is left/right invertible **if and only if** $\det(A) \ne 0$. In the case when $\det(A) \ne 0$, we also showed how one **left inverse** and one **right inverse** can be constructed, i.e. we showed the **existence of left/right inverses** (the matrix $T$ is both a left and a right inverse of $A$).

This construction of $T$ (from $A$) is important, so we will keep denoting this so-constructed matrix by $T$ for the rest of this post.

**Th3:** For each matrix $A$ with $\det(A) \ne 0$ there is a unique left inverse and a unique right inverse, and they are both equal to the matrix $T$ constructed above.

**Proof:** Let's assume that $BA=CA=I$ for some matrices $B, C$, both left inverses of $A$.

Then $T=IT=(BA)T=B(AT)=BI = B$

And also $T=IT=(CA)T=C(AT)=CI = C$

OK, so it follows that $B=C$ (and both $B$ and $C$ are equal to that special matrix $T$). This proves the uniqueness of the **left inverse**.

**Note that to prove the uniqueness of the left inverse we used the existence of the right inverse $T$ of $A$.**

The uniqueness of the **right inverse** is proved in the same way.

Let's assume that $AB'=AC'=I$ for some matrices $B', C'$, both right inverses of $A$.

Then $T=TI=T(AB')=(TA)B'=IB' = B'$

Also $T=TI=T(AC')=(TA)C'=IC' = C'$

So it follows that $B'=C'$ (and both $B'$ and $C'$ are equal to that special matrix $T$). This proves the uniqueness of the **right inverse**.

**Note that to prove the uniqueness of the right inverse we used the existence of the left inverse $T$ of $A$.**

We are done. Now we have everything introduced in a clear way. We proved that $A$ is left/right invertible if and only if its determinant is non-zero. And we proved that in that case (when the determinant is non-zero) the left/right inverses exist (the matrix $T$), and also that they are unique and coincide (both are equal to $T$).
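In practice this uniqueness means that any library routine computing an inverse must return (up to floating-point error) the same matrix $T$. A small sketch with NumPy's `np.linalg.inv` (the random matrix here is almost surely non-singular, which is an assumption of this check):

```python
import numpy as np

# Uniqueness in practice: the inverse returned by np.linalg.inv is
# simultaneously the unique left inverse and the unique right inverse.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))  # almost surely det(A) != 0
T = np.linalg.inv(A)
print(np.allclose(T @ A, np.eye(5)) and np.allclose(A @ T, np.eye(5)))
```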
