Let $A$ be a $n \times n$ square matrix whose all columns are independent. Is $Ax = b$ always solvable?

Actually, I know that $Ax= b$ is solvable if $b$ is in the column space of $A$. However, I am not sure if it is solvable for all values of $b$.
0
No, it is not solvable for every $b$. $b$ must lie in the span of the column vectors of $A$; in other words, $Ax = b$ is solvable exactly when $b$ can be written as a linear combination of the columns of $A$.
0
In the question, what do $x$ and $b$ mean?
0
$x$ and $b$ are $n \times 1$ vectors.
0
Then when is it not solvable?
0

Consider the following equation-

$\begin{bmatrix} 1 & 2 & 3\\ 4& 5 & 6\\ 7& 8 & 10 \end{bmatrix} \begin{bmatrix} x_{1}\\ x_{2}\\ x_{3} \end{bmatrix} = \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$

Here, we cannot get any value of $x_{1}$, $x_{2}$ and $x_{3}$ such that the above equation is satisfied.

Edit: This is actually solvable, with $x = \begin{bmatrix} {-2/3}\\ {-2/3}\\ 1 \end{bmatrix}$. I made a mistake while trying to solve it using Gaussian elimination.

On the other hand, if $b$ is in the column space of $A$, e.g. $b = \begin{bmatrix} 1\\ 4\\ 6 \end{bmatrix}$, then $x = \begin{bmatrix} 0\\ 2\\ -1 \end{bmatrix}$ satisfies $Ax = b$.
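A quick numerical check of the example above (a sketch assuming NumPy is available):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 10]], dtype=float)
b = np.array([1, 0, 0], dtype=float)

# The columns of A are independent, so det(A) is nonzero and A is invertible.
print(np.linalg.det(A))      # approximately -3, nonzero

x = np.linalg.solve(A, b)
print(x)                     # x is approximately [-2/3, -2/3, 1]
```

This confirms the corrected solution: the system is solvable because $\det(A) \neq 0$.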

0
what if b =$\begin{bmatrix} 1\\ 0\\ 1 \end{bmatrix}$
0
Here there are 3 equations and 3 unknowns,

so it must be solvable,

right??
0
For $b = \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$ there were also 3 equations and 3 unknowns.
0
Yes, that is what I said:

a solution is always possible.
0
Let $A = \begin{bmatrix} col_{1} & col_{2} & col_{3} \end{bmatrix}$ be any arbitrary $3 \times 3$ matrix.

Here, $col_{1}, col_{2}, col_{3}$ are the column vectors of A, each of dimension $3 \times 1$.

Also, let $x =\begin{bmatrix} x_{1}\\ x_{2}\\ x_{3} \end{bmatrix}$.

$\therefore Ax = b \Rightarrow x_{1}\,col_{1} + x_{2}\,col_{2} + x_{3}\,col_{3} = b$, i.e., $b$ is a linear combination of the column vectors of $A$.

Therefore, solving the equation $Ax = b$ reduces to finding a linear combination of the columns of $A$ that equals the vector $b$. Such a combination exists if and only if $b$ lies in the column space of $A$.
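The column-combination view above can be verified directly; a minimal sketch assuming NumPy, reusing the $3 \times 3$ matrix from the earlier example:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 10]], dtype=float)
x = np.array([0, 2, -1], dtype=float)

# A @ x equals the linear combination x1*col1 + x2*col2 + x3*col3.
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
print(np.allclose(A @ x, combo))   # True: Ax is a combination of A's columns
print(combo)                       # the resulting b, here [1, 4, 6]
```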
0
$\begin{bmatrix} 1 & 2 & 3\\ 4& 5 &6 \\ 7& 8 & 10 \end{bmatrix}$ $\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ $=$ $\begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}$

This has a solution of $\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}$ $=$ $\begin{bmatrix} -(2/3)\\ -(2/3)\\ 1 \end{bmatrix}$
0

yes right.

@ankitgupta.1729

you are not correct here

A few inferences first:

• If the rank of the augmented matrix $[A:b]$ $=$ rank$[A]$ $= n$ (the order of the matrix $A$), then the system is solvable, e.g. by Gaussian elimination, and it has a unique solution.
• If the rank of the augmented matrix $[A:b]$ $=$ rank$[A]$ $< n$, then the system is solvable and has infinitely many solutions.
• If the rank of the augmented matrix $[A:b]$ $\neq$ rank$[A]$, then the system is not solvable.

Now, as per your question: if $A$ is a matrix whose columns are all linearly independent, then its rank is $n$, and whatever $b$ may be, the augmented matrix $[A:b]$ also has rank $n$. Hence the system is always solvable and has a unique solution.
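The three rank cases can be checked mechanically; a sketch assuming NumPy, where `classify_system` is an illustrative helper (not a library function):

```python
import numpy as np

def classify_system(A, b):
    """Classify Ax = b using the rank conditions above."""
    n = A.shape[1]                                        # number of unknowns
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rank_A != rank_Ab:
        return "no solution"
    if rank_A == n:
        return "unique solution"
    return "infinitely many solutions"

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 10]], dtype=float)
print(classify_system(A, np.array([1.0, 0.0, 0.0])))      # rank(A) = 3 = n

S = np.array([[1, 2], [2, 4]], dtype=float)               # dependent columns
print(classify_system(S, np.array([1.0, 3.0])))           # b not in column space
```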

For more details, you can refer to the lectures by Gilbert Strang or his book on linear algebra.

0
Thanks. I was doing Gaussian elimination but made a mistake.