0 votes
Let $A$ be an $n \times n$ square matrix all of whose columns are linearly independent. Is $Ax = b$ always solvable?

Actually, I know that $Ax = b$ is solvable if $b$ is in the column space of $A$. However, I am not sure whether it is solvable for all values of $b$.
in Linear Algebra by Junior (693 points) | 99 views
No, it is not solvable for every $b$. $b$ must lie in the space spanned by the column vectors of $A$; in other words, the system is solvable exactly when $b$ can be written as a linear combination of the columns of $A$.
In the question, what do $x$ and $b$ mean?
$x$ and $b$ are $n \times 1$ vectors.
Then for which $b$ is it not solvable?

Consider the following equation-

$\begin{bmatrix} 1 & 2 & 3\\ 4& 5 & 6\\ 7& 8 & 10 \end{bmatrix} \begin{bmatrix} x_{1}\\ x_{2}\\ x_{3} \end{bmatrix} = \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$

Here, we cannot get any value of $x_{1}$, $x_{2}$ and $x_{3}$ such that the above equation is satisfied.

Edit: This is solvable, with $x = \begin{bmatrix} {-2/3}\\ {-2/3}\\ 1 \end{bmatrix}$. I made a mistake while solving it with the Gaussian elimination method.

On the other hand, if $b$ is in the column space of $A$, e.g. $b = \begin{bmatrix} 1\\ 4\\ 6 \end{bmatrix}$, we can find $x = \begin{bmatrix} 0\\ 2\\ -1 \end{bmatrix}$ such that $Ax = b$ holds.

What if $b =\begin{bmatrix} 1\\ 0\\ 1 \end{bmatrix}$?
Here there are 3 equations and 3 unknowns,

so it must be solvable.

For $b = \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$ there were also 3 equations and 3 unknowns.
Yes, that is what I said:

a solution is always possible.
Let $A = \begin{bmatrix} col_{1} & col_{2} & col_{3} \end{bmatrix}$ be any arbitrary $3 \times 3$ matrix.

Here, $col_{1}, col_{2}, col_{3}$ are the column vectors of A, each of dimension $3 \times 1$.

Also, let $x =\begin{bmatrix} x_{1}\\ x_{2}\\ x_{3} \end{bmatrix}$.

$\therefore Ax = b \Rightarrow x_{1}\,col_{1} + x_{2}\,col_{2} + x_{3}\,col_{3} = b$, i.e., $b$ is a linear combination of the column vectors of $A$.

Therefore, finding a solution of the equation $Ax = b$ reduces to finding a linear combination of the column vectors of matrix $A$ that results in vector $b$. Such a combination exists if and only if vector $b$ lies in the column space of matrix $A$.
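The column-combination view above can be sketched in plain Python. As an illustration (reusing the $3 \times 3$ matrix from this thread and the coefficients $x = (0, 2, -1)$ mentioned later), $Ax$ is computed directly as $x_1\,col_1 + x_2\,col_2 + x_3\,col_3$:

```python
from fractions import Fraction as F

# Columns of the example matrix A from this thread.
col1 = [F(1), F(4), F(7)]
col2 = [F(2), F(5), F(8)]
col3 = [F(3), F(6), F(10)]
x1, x2, x3 = F(0), F(2), F(-1)

# Ax is exactly the linear combination x1*col1 + x2*col2 + x3*col3.
b = [x1 * c1 + x2 * c2 + x3 * c3 for c1, c2, c3 in zip(col1, col2, col3)]
print([int(v) for v in b])  # [1, 4, 6] -- so this b lies in the column space of A
```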
$\begin{bmatrix} 1 & 2 & 3\\ 4& 5 &6 \\ 7& 8 & 10 \end{bmatrix}$ $\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ $=$ $\begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$

This has a solution of $\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ $=$ $\begin{bmatrix} -(2/3)\\ -(2/3)\\ 1 \end{bmatrix}$
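This solution is easy to verify mechanically; a minimal check in Python, using `fractions` for exact arithmetic so the thirds do not pick up rounding error:

```python
from fractions import Fraction as F

# The matrix and right-hand side from the example above.
A = [[F(1), F(2), F(3)],
     [F(4), F(5), F(6)],
     [F(7), F(8), F(10)]]
b = [F(1), F(0), F(0)]
x = [F(-2, 3), F(-2, 3), F(1)]

# Multiply A by x and compare with b.
Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
print(Ax == b)  # True: x solves Ax = b
```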

Yes, right.


You are not correct here.

1 Answer

+2 votes

A few inferences first:

  • If the rank of the augmented matrix $[A:b]$ $=$ rank $[A]$ $= n$ (the order of the matrix $A$), then the system is solvable, can be solved by Gaussian elimination, and has a unique solution.
  • If the rank of the augmented matrix $[A:b]$ $=$ rank $[A] < n$, then the system is solvable and has infinitely many solutions.
  • If the rank of the augmented matrix $[A:b]$ $\neq$ rank $[A]$, then the system is not solvable.
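These three cases can be checked mechanically by row-reducing $A$ and $[A:b]$ and comparing ranks. A minimal sketch in plain Python (exact arithmetic via `fractions`; no external library assumed), applied to the matrix from this thread:

```python
from fractions import Fraction as F

def rank(rows):
    """Rank via Gaussian elimination with exact Fraction arithmetic."""
    M = [[F(v) for v in row] for row in rows]
    r = 0
    for c in range(len(M[0])):
        # Find a pivot in column c at or below row r.
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # Eliminate the entries below the pivot.
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * p for a, p in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]   # columns independent: rank 3
b = [1, 0, 0]
aug = [row + [v] for row, v in zip(A, b)]
print(rank(A), rank(aug))  # 3 3 -> first case: unique solution
```

Since both ranks equal $n = 3$, this lands in the first case, matching the corrected comment above.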

Now, as per your question: if all columns of $A$ are linearly independent, then rank $[A] = n$. Since the augmented matrix $[A:b]$ has only $n$ rows, its rank can be at most $n$, so rank $[A:b]$ $=$ rank $[A]$ $= n$ for every $b$. Hence $Ax = b$ is always solvable and has a unique solution.

For more details you can refer to the lectures by Gilbert Strang or his book on linear algebra.

by Active (1.2k points)
Thanks. I was doing Gaussian elimination but made a mistake.