21 votes

Consider the set of (column) vectors defined by $$X = \left \{x \in R^3 \mid x_1 + x_2 + x_3 = 0, \text{ where } x^T = \left[x_1,x_2,x_3\right]\right \}.$$ Which of the following is TRUE?

  1. $\left\{\left[1,-1,0\right]^T,\left[1,0,-1\right]^T\right\}$ is a basis for the subspace $X$.
  2. $\left\{\left[1,-1,0\right]^T,\left[1,0,-1\right]^T\right\}$ is a linearly independent set, but it does not span $X$ and therefore is not a basis of $X$.
  3. $X$ is not a subspace of $R^3$.
  4. None of the above
in Linear Algebra
edited by
@arjun sir
What is a subspace? Is this in the present syllabus?

Quoting from wikipedia:

If W is a vector space itself (which means that it is closed under operations of addition and scalar multiplication), with the same vector space operations as V has, then W is a subspace of V.

Can someone please explain the question?
What does $x \in R^3$ mean here? What is $R^3$?
$R^3$ refers to the set of all possible 3-tuples $(x, y, z)$, where $x, y, z \in R$ and $R$ denotes the set of real numbers.
why are they giving questions on vector space and all when it is not in syllabus???
are you sure that it is not in the syllabus ?!
Shouldn't the definition of $X$ mean that $x$ is a row vector, and not a column vector as mentioned by them in the problem?

If $x$ had to be a column vector, then as per me $x$ should have been defined as

$x = \left[x_1 \; x_2 \; x_3\right]^T$

so that

$x^T = \left[x_1 \; x_2 \; x_3\right]$

(Am I missing any point? Please correct me.)
Don’t go too deep, friends. The simple answer: since both vectors are linearly independent, they span the whole subspace and form a basis.
@rishabhsharma, no, it is incorrect.
A basis of a vector space is a set of linearly independent vectors that span the whole space. Isn’t that the correct definition of a basis, brother?
Yes, it is the correct definition, but what you said previously was not correct. If vectors are linearly independent, it does not mean that they span the vector space. For example, here the set $\{[1,-1,0]^T\}$ is linearly independent but does not span $X.$ There is a theorem which says that if $L=\{w_1,w_2,\dots,w_m\}$ is a linearly independent set, then we can extend it to a basis for the finite-dimensional vector space. So, in the case of linearly independent sets we add vectors to get a basis, and in the case of spanning sets we remove vectors to get a basis.
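The independence-without-spanning point can be checked concretely; a small pure-Python sketch (illustrative only, no libraries assumed):

```python
# The singleton {(1,-1,0)} is linearly independent, yet it does not span X:
# (1,0,-1) lies in X but is not a scalar multiple of (1,-1,0).
w = (1, -1, 0)
target = (1, 0, -1)

assert sum(target) == 0  # target satisfies x1 + x2 + x3 = 0, so it is in X

# Every multiple c*w has third component c*0 = 0, but target's is -1,
# so no scalar c can reach target.
assert all(c * w[2] != target[2] for c in range(-100, 101))
print("{(1,-1,0)} is independent but does not span X")
```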

I found this solution to be satisfactory and easily understandable :)

4 Answers

27 votes
Best answer

$\begin{bmatrix} 1 & -1 & 0\\ 1 & 0 & -1 \end{bmatrix}$ has rank $2,$ because the size of the largest square submatrix whose determinant is nonzero is $2$, e.g. $\begin{vmatrix} 1 & -1\\ 1 & 0 \end{vmatrix} = 1 \ne 0.$ That is, both vectors are linearly independent.

Given, $x_1 + x_2 + x_3 = 0 \implies x_3 = - (x_1 + x_2)$

$A= \begin{vmatrix} 1 & -1 & 0\\ 1 & 0 & -1\\ x_1 & x_2 & -(x_1+x_2) \end{vmatrix} = x_2 + 1(-x_1 -x_2) + x_1 =0$

which means Rank$(A) \ne 3 \implies$ the three rows are linearly dependent. We already know that $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$ are linearly independent.

So, vector $\begin{bmatrix}x_1 & x_2& x_3\end{bmatrix}$  where $x_1 + x_2 + x_3 = 0$ is linearly dependent on $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$

Since every vector of the form $\begin{bmatrix}x_1 & x_2& x_3\end{bmatrix}$ with $x_1 + x_2 + x_3 = 0$ is a linear combination of $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$, these two vectors form a basis of the vector space $X.$

So, (A) is TRUE.  
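The rank argument above can be sanity-checked numerically; a minimal pure-Python sketch (the helper `det3` is just an illustration):

```python
def det3(m):
    """3x3 determinant via cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Appending any row (x1, x2, -(x1+x2)) to the two given vectors yields a
# singular 3x3 matrix, so the appended row depends on the first two.
for x1, x2 in [(3, 5), (-2, 7), (0, 0), (1, -1)]:
    m = [[1, -1, 0], [1, 0, -1], [x1, x2, -(x1 + x2)]]
    assert det3(m) == 0
print("determinant is 0 for every sampled (x1, x2)")
```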


Subspace of a vector space

Any subset of a vector space (a collection of vectors) is a subspace iff

  1.  that subset contains zero vector
  2. that subset is closed under scalar multiplication 
  3. that subset is closed under addition of any $2$ vectors in that subset.

$X = \begin{bmatrix}x_1 & x_2& x_3 \end{bmatrix}$ where $x_1 + x_2 + x_3 = 0$

1. $X$ contains zero vector $\begin{bmatrix}0 & 0& 0 \end{bmatrix}$ as $0 + 0 + 0 = 0$

2. Let $C$ be a scalar (any real number)

$C\begin{bmatrix}x_1 & x_2& x_3 \end{bmatrix} = \begin{bmatrix}Cx_1 & Cx_2& Cx_3 \end{bmatrix} \in X$ because $Cx_1 + Cx_2 + Cx_3 = C (x_1 + x_2 + x_3) = C.0 = 0$.

3. Let $Y=\begin{bmatrix}y_1 \\ y_2\\  y_3\end{bmatrix} \in X$ where $y_1 + y_2 + y_3 = 0$ and $Z=\begin{bmatrix}z_1 \\ z_2\\  z_3\end{bmatrix} \in X$ where $z_1 + z_2 + z_3 = 0.$

$Y + Z  = \begin{bmatrix}y_1+ z_1 \\ y_2+ z_2\\  y_3 + z_3\end{bmatrix} \in X$ as

$y_1 + z_1+ y_2 + z_2 + y_3  + z_3$

$= (y_1 + y_2 + y_3) + (z_1 + z_2 + z_3) =0.$

So, clearly $X$ is a subspace of $R^3.$
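The three subspace conditions can likewise be exercised on sample vectors; a quick pure-Python sketch (samples chosen arbitrarily):

```python
def in_X(v):
    """Membership test for X: the components must sum to zero."""
    return sum(v) == 0

# Vectors of the form (a, b, -(a+b)) enumerate points of X.
samples = [(a, b, -(a + b)) for a in range(-3, 4) for b in range(-3, 4)]

assert in_X((0, 0, 0))                            # 1. zero vector
for v in samples:
    for c in (-2, 0, 3):                          # 2. scalar closure
        assert in_X(tuple(c * t for t in v))
for v in samples:
    for w in samples:                             # 3. additive closure
        assert in_X(tuple(p + q for p, q in zip(v, w)))
print("X passes all three subspace checks on these samples")
```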

So (A) is the answer.

selected by
Thank you very much for the detailed solution.

@Vicky rix

Condition for a basis: it should span the space and not contain any redundant vector, i.e., the vectors should be linearly independent.

We can verify option 1 directly: it contains linearly independent vectors which also span $X$. With this we can eliminate option 2 (which contradicts option 1) and option 3 (since option 1 holds, $X$ is indeed a subspace).

Can we go with this approach?

I am not getting how it will not span $X$. Is the answer (A) or (B)?


Both vectors are linearly independent and span the whole subspace $X$. It means that any linear combination of the two vectors is also in the subspace $X$, i.e., it satisfies the given condition.

It means that when we take the determinant of those two vectors together with a vector from the subspace, it always comes out to be zero, which means a linear combination of those two vectors lies in that subspace. Am I correct?
The determinant is defined only for square matrices. Here we have $2$ vectors with $3$ components each, so you can't conclude anything through a determinant.

A basis for a vector space contains the right number (neither fewer nor more) of linearly independent vectors which span the whole vector space. It means you can take any linear combination of these linearly independent vectors, and the resulting vector will also be in that space.
But if I include those two vectors together with a third vector which satisfies the subspace condition, then the determinant will be zero, right? So the answer will be (A), right?

@Kaluti @ankitgupta.1729

Basis means there should be a leading $1$ in the span given.

So, according to (A), the basis of the subspace will have $2$ vectors.

But is it not possible here that, in some case, the basis of the subspace has $3$ vectors?

A basis of a vector space is a set of linearly independent vectors which span that vector space.

A basis is not unique, but the number of vectors must be the same in all bases.

In simple words: a vector space contains many vectors. Choose some linearly independent vectors from it; if by taking linear combinations of these independent vectors we can get all the vectors in that vector space, then those linearly independent vectors form a basis for it.


From where are you reading this? It is really hard to get this concept from Narsingh Deo.

@srestha mam, I followed Prof. Gilbert Strang's book and his video lectures for it..


Check this from Deo's book; it supports my argument. In this vector space $X,$ the basis can have $3$ vectors too.

but in the matrix

$\begin{bmatrix} 1&-1 &0 \\ 1&0&-1 \\ \end{bmatrix}$

with row/column operations it gives $\begin{bmatrix} 1&0 &0 \\ 0&1&0 \\ \end{bmatrix}$

So, it is a basis of the vector space $X$


Ma'am, which row operations have you used?

$\{(1,0,0),(0,1,0)\}$ does not form a basis for subspace $X$: take the linear combination $(1,0,0) + (0,1,0) = (1,1,0)$. This vector is not in subspace $X$ because the sum of its components is not zero, which is the requirement of $X$. So $\{(1,0,0),(0,1,0)\}$ does not span the subspace $X$ and hence cannot be called a basis for it.
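The counterexample in this comment is easy to verify mechanically; a tiny pure-Python sketch:

```python
# (1,0,0) + (0,1,0) = (1,1,0), whose components sum to 2, not 0,
# so the sum escapes X and {(1,0,0),(0,1,0)} cannot span (or even sit in) X.
e1, e2 = (1, 0, 0), (0, 1, 0)
s = tuple(a + b for a, b in zip(e1, e2))

assert s == (1, 1, 0)
assert sum(s) != 0          # the sum is not in X
assert sum(e1) != 0         # in fact e1 itself is already outside X
print("(1,1,0) violates x1 + x2 + x3 = 0")
```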
That resultant vector represents all possible vectors, whether dependent or independent, in that subspace. Am I right?


Where did you get that the linear combination always needs to be $0?$

The number of basis vectors is nothing but the number of pivots.


Here row space of the matrix is $R^{3}$ and column space would be $R^{2}$


But as I told, it is not the only basis. A basis can change, and this is one of them for this subspace.
The column space is a subspace of $R$.

The matrix should be $[1,1,1]$ for the given subspace; check my answer below.

If we have to find the number of vectors in a basis for the column space, we consider the number of pivots, which equals the number of linearly independent columns, i.e., the rank of the matrix. To find the number of vectors in a basis for the null space of the matrix, we compute (number of columns of the matrix) $-$ (rank of the matrix).
6 votes

Option $(a)$, here is the answer,

edited by
Here, from the option we can conclude that the two given column vectors are linearly independent.

Let $A$ be the first vector and $B$ the second, with constants $k_1, k_2.$ From $k_1A + k_2B = 0$ we get $k_1 = k_2 = 0.$

But how to analyse whether these vectors span $X$ or not?
What is a subspace?
6 votes

Here, subspace $X \subseteq \mathbb{R}^{3}$ is defined as :-

$$X = \left \{x \in \mathbb{R}^3 \mid x_1 + x_2 + x_3 = 0, \text{ where } x^T = \left[x_1,x_2,x_3\right]\right \}$$

which means the sum of the components of every vector in subspace $X$ is zero.

We can write  $ x_1 + x_2 + x_3 = 0$ as

                                                 $\begin{bmatrix} 1 &1 &1 \end{bmatrix}$$\begin{bmatrix} x_{1}\\x_2 \\ x_3 \end{bmatrix} = 0$

Now, vector $x$ is in the nullspace of the matrix $[1,1,1]$, and we have to find a basis for this nullspace.

Since the components of vector $x$ sum to zero, we can write $x_3 = -x_2 - x_1.$


$\begin{bmatrix} x_{1}\\x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_{1}\\x_2 \\ -x_2-x_1 \end{bmatrix} =x_1\begin{bmatrix} 1\\0 \\ -1 \end{bmatrix} + x_2\begin{bmatrix} 0\\1 \\-1 \end{bmatrix}$ where $x_1,x_2 \in \mathbb{R}$

So, vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ should be in the subspace $X.$

Now, we have to check whether these vectors are linearly independent or not.


$k_1\begin{bmatrix} 1\\0 \\-1 \end{bmatrix} + k_2\begin{bmatrix} 0\\1 \\-1 \end{bmatrix} = \begin{bmatrix} 0\\0 \\0 \end{bmatrix}$

$\Rightarrow$ $k_1 = 0$ and $k_2 = 0$ which implies both vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ are linearly independent.

Since $x \in \mathbb{R}^{3}$ and lies in the nullspace, the number of columns is $n = 3$ (as can be seen from the matrix $[1,1,1]$).

Now, say, matrix $[1,1,1]$ is $A$. So, $rank(A)= r = 1$.

So, $\dim(N(A)) = n-r = 3-1 =2.$ It means a basis for the nullspace has $2$ vectors. A basis contains linearly independent vectors which must span the whole vector space.

Hence, we can say that here basis has $2$ linearly independent vectors which are  $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$

So, Basis for $X$ is $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}.$
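The nullspace computation can be spot-checked in plain Python (a sketch; `A` here stands for the row matrix $[1,1,1]$ from the answer):

```python
A = (1, 1, 1)  # the coefficient row of x1 + x2 + x3 = 0

def Ax(x):
    """Multiply the 1x3 matrix A by a 3-vector x."""
    return sum(a * t for a, t in zip(A, x))

basis = [(1, 0, -1), (0, 1, -1)]
for b in basis:
    assert Ax(b) == 0  # each claimed basis vector lies in N(A)

# Any combination x1*(1,0,-1) + x2*(0,1,-1) also lies in the nullspace,
# matching dim N(A) = 3 - rank(A) = 3 - 1 = 2.
for x1, x2 in [(2, 3), (-1, 4), (0, 0)]:
    v = tuple(x1 * p + x2 * q for p, q in zip(*basis))
    assert Ax(v) == 0
print("basis vectors and their combinations all satisfy A x = 0")
```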

A basis is not unique: we can have many bases for a given vector space, but the number of vectors in all bases must be the same. Since the option contains a different basis, we now change the basis for subspace $X$.

Since the basis for $X$ is $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \},$ the vectors $\begin{bmatrix} 1\\0 \\-1 \end{bmatrix}\text{ and }\begin{bmatrix} 0\\1 \\-1 \end{bmatrix}$ span the whole subspace $X$. So, by taking a linear combination of both vectors, we get another vector:

$1\cdot\begin{bmatrix} 1\\0 \\-1 \end{bmatrix} + (-1)\begin{bmatrix} 0\\1 \\ -1 \end{bmatrix} = \begin{bmatrix} 1\\-1 \\0 \end{bmatrix}.$

So, the vectors $\begin{bmatrix} 1\\-1 \\0 \end{bmatrix}\text{ and }\begin{bmatrix} 1\\0 \\-1 \end{bmatrix}$ will be in subspace $X.$ Now, we have to check whether these are linearly independent or not.

So, $k_1\begin{bmatrix} 1\\-1 \\ 0 \end{bmatrix} +k_2\begin{bmatrix} 1\\0 \\ -1 \end{bmatrix} = \begin{bmatrix} 0\\0 \\ 0 \end{bmatrix}$ which implies $k_1=0$ and $k_2=0.$

It means the vectors $\begin{bmatrix} 1\\-1 \\0 \end{bmatrix}\text{ and }\begin{bmatrix} 1\\0 \\-1 \end{bmatrix}$ are linearly independent and form a basis for subspace $X.$

So, Another Basis for $X$ is $\left \{ \begin{bmatrix} 1\\-1 \\0 \end{bmatrix}, \begin{bmatrix} 1\\0 \\-1 \end{bmatrix} \right \}.$
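That the two bases generate the same subspace can be confirmed by expressing each set in terms of the other; a brief pure-Python sketch:

```python
# u1, u2: the first basis found; v1, v2: the basis from option 1.
u1, u2 = (1, 0, -1), (0, 1, -1)
v1, v2 = (1, -1, 0), (1, 0, -1)

def lin(a, b, p, q):
    """Componentwise a*p + b*q."""
    return tuple(a * x + b * y for x, y in zip(p, q))

assert lin(1, -1, u1, u2) == v1   # (1,-1,0) = u1 - u2
assert lin(1, 0, u1, u2) == v2    # (1,0,-1) = u1
assert lin(0, 1, v1, v2) == u1    # u1 = v2
assert lin(-1, 1, v1, v2) == u2   # (0,1,-1) = v2 - v1
print("each basis lies in the span of the other")
```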

Hence, Answer is $(A).$

edited by


You are taking $k_{1}=0, k_{2}=0$ and intentionally forming a matrix which is linearly independent.

Otherwise the vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ never produce $\begin{bmatrix} 0\\0 \\0 \end{bmatrix}.$

Right? But why?

Another thing: you got the vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$

How is that matching with $\left \{ \begin{bmatrix} 1\\-1 \\0 \end{bmatrix} \right \}$?



@srestha mam, Thanks. Yes, I got different basis which is not matching with the given options. Updated the answer.


I have $3$ points:

  • A span is the set of linear combinations of vectors.
  • A basis is where span$\left ( v_{1}^{\rightarrow },v_{2}^{\rightarrow },\dots,v_{n}^{\rightarrow } \right )$ has linearly independent vectors.
  • And every span is a valid subspace.

Now, you first took $2$ vectors and made a $3$rd by combining them, and once you took $k_{1}=1, k_{2}=-1$, and another time $k_{1}=0, k_{2}=0.$

These two points are not clear to me.

Can we change the $k_{1}, k_{2}$ values independently?

Moreover, do the span of a linear combination of two vectors and the span of any one of those vectors give the same output?


once u took k1=1,k2=−1, and another time k1=0,k2=0

These two point not clear to me

I have written $k_1=0$ and $k_2=0$ for both bases.

Moreover, span in linear combination of two vectors and span in anyone of those vector gives same output?

Sorry, not getting you.


I have written k1=0 and k2=0 for both bases.

both bases means? 

bases is the plural form of basis. I have written 1st basis as $\{(1,0,-1),(0,1,-1)\}$ and 2nd basis as $\{(1,-1,0),(1,0,-1)\}$ in the answer and in both cases $k_1$ and $k_2$ are zero.


Are the following implications correct -


1. Basis of vector space $X$ means set of all linearly independent vectors, whose linear combinations will give $X$.


2. Just by looking at the options, it can be deduced that $[\begin{matrix} 1 & 0 & -1\end{matrix}]$ cannot be expressed as $ k \times [\begin{matrix} 0 & 1 & -1 \end{matrix}], k\in R,$ so they're linearly independent and a good candidate for a basis. So, the answer narrows down to options (A) and (B).


3. Now, they should be able to derive any vector in $X$ by linear combination.

[This is where I got stuck. Is there an easy way to check this step? I would definitely mark (A) if this was asked in the exam, since I couldn't think of a vector that can't be generated by the given two.]




all points are correct.

Now, whether a vector in $X$ is a linear combination of those $2$ vectors can be seen by solving for the coefficients directly.

Or: the size of a spanning set of a vector space is always greater than or equal to the dimension of that vector space.
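The step the comment asks about can be made fully explicit: solving $a(1,-1,0)+b(1,0,-1)=(x_1,x_2,x_3)$ forces $a=-x_2$ and $b=-x_3$, and consistency of the first component is exactly the condition $x_1+x_2+x_3=0$. A small pure-Python sketch of this check:

```python
v1, v2 = (1, -1, 0), (1, 0, -1)

def combo(a, b):
    """Componentwise a*v1 + b*v2."""
    return tuple(a * p + b * q for p, q in zip(v1, v2))

# For any x with x1 + x2 + x3 = 0, the coefficients a = -x2, b = -x3 work:
# the second component gives -a = x2, the third gives -b = x3, and then
# a + b = -(x2 + x3) = x1 precisely because the components sum to zero.
for x in [(2, -1, -1), (0, 5, -5), (-3, 1, 2), (7, -7, 0)]:
    assert sum(x) == 0
    assert combo(-x[1], -x[2]) == x
print("every sampled vector of X is a combination of the two given vectors")
```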

Yes! I think that's what I indirectly did in my mind lol, that's neat.

Also, another thing.

Is the cardinality of a basis always the same for a given vector space?
4 votes

(B) I think is the answer. We require two constants $c_1$ and $c_2$ which satisfy $R^3$, which is

let $X=\begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&1 \end{bmatrix}$

Writing it in echelon form, the system comes out to be inconsistent and hence it does not span.

edited by
What is a subspace? (given in the option)
$X = \{(x_1, x_2, x_3): x_1 + x_2 + x_3 = 0\}$

It implies,

$X = \{(-x_2 - x_3, x_2, x_3): x_2, x_3 \in R\}$

which can also be expressed as,

$\{x_2(-1, 1, 0) + x_3(-1, 0, 1): x_2, x_3 \in R\}$.

Equivalently, it also holds as,

$\{x_2(1, -1, 0) + x_3(1, 0, -1): x_2, x_3 \in R\}$.
I am not getting how it will not span $X$.
