+17 votes
3.7k views

Consider the set of (column) vectors defined by $$X = \left \{x \in R^3 \mid x_1 + x_2 + x_3 = 0, \text{ where } x^T = \left[x_1,x_2,x_3\right]\right \}.$$ Which of the following is TRUE?

  1. $\left\{\left[1,-1,0\right]^T,\left[1,0,-1\right]^T\right\}$ is a basis for the subspace $X$.
  2. $\left\{\left[1,-1,0\right]^T,\left[1,0,-1\right]^T\right\}$ is a linearly independent set, but it does not span $X$ and therefore is not a basis of $X$.
  3. $X$ is not a subspace of $R^3$.
  4. None of the above
in Linear Algebra by Veteran (52.2k points)
edited by
0
@arjun sir
What is a subspace? Is this in the present syllabus?
+1

Quoting from wikipedia:

If a subset W of a vector space V is itself a vector space (which means that it is closed under the operations of addition and scalar multiplication), with the same vector space operations as V has, then W is a subspace of V.

0
Can someone please explain the question?
What does $X = \{x \in R^3 \mid \ldots\}$ mean? What does $R^3$ denote here?
+3
$R^3$ refers to the set of all possible 3-tuples $(x, y, z)$, where $x, y, z \in R$, and $R$ denotes the set of real numbers.
0
Why are they giving questions on vector spaces when it is not in the syllabus?
0
Are you sure that it is not in the syllabus?

4 Answers

+22 votes
Best answer

The matrix $\begin{bmatrix} 1 & -1 & 0\\ 1 & 0 & -1 \end{bmatrix}$ formed by the two given vectors has the $2\times 2$ submatrix $\begin{vmatrix} 1 & -1\\ 1 & 0 \end{vmatrix} = 1 \ne 0.$

So its Rank $= 2$, because the size of the largest square submatrix whose determinant is not equal to $0$ is $2$, i.e. both are linearly independent vectors.

Given, $x_1 + x_2 + x_3 = 0 \implies x_3 = - (x_1 + x_2)$

Let $A= \begin{bmatrix} 1 & -1 & 0\\ 1 & 0 & -1\\ x_1 & x_2 & -(x_1+x_2) \end{bmatrix}$. Then $\det(A) = 1\cdot x_2 + 1\cdot\bigl(-(x_1+x_2) + x_1\bigr) + 0 = x_2 - x_2 = 0,$

which means Rank$(A) \ne 3 \implies$ there exists at least one vector which is linearly dependent on the others. We already know that $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$ are linearly independent.

So, every vector $\begin{bmatrix}x_1 & x_2& x_3\end{bmatrix}$ where $x_1 + x_2 + x_3 = 0$ is linearly dependent on $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}.$

Since every vector of the form $\begin{bmatrix}x_1 & x_2& x_3\end{bmatrix}$ where $x_1 + x_2 + x_3 = 0$ is a linear combination of $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$, we can say that $\begin{bmatrix}1 & -1& 0\end{bmatrix}$ and $\begin{bmatrix}1 & 0& -1\end{bmatrix}$ form a basis of the vector space $X.$

So, (A) is TRUE.  
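As a quick numerical sanity check of both claims (a sketch using numpy; the array names are just illustrative):

```python
import numpy as np

# The two candidate basis vectors, as rows of a matrix.
B = np.array([[1, -1,  0],
              [1,  0, -1]])

# Rank 2 => the two vectors are linearly independent.
print(np.linalg.matrix_rank(B))                    # 2

# Take an arbitrary vector satisfying x1 + x2 + x3 = 0, e.g. x1 = 2, x2 = 5.
x1, x2 = 2.0, 5.0
x = np.array([x1, x2, -(x1 + x2)])                 # (2, 5, -7)

# Appending x does not increase the rank, so x is a linear
# combination of the two rows, i.e. it lies in their span.
print(np.linalg.matrix_rank(np.vstack([B, x])))    # still 2
```

For this particular $x$, the combination is $\begin{bmatrix}2 & 5& -7\end{bmatrix} = -5\begin{bmatrix}1 & -1& 0\end{bmatrix} + 7\begin{bmatrix}1 & 0& -1\end{bmatrix}.$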

 

Subspace of a vector space

A subset of a vector space (a collection of its vectors) is called a subspace iff

  1. the subset contains the zero vector,
  2. the subset is closed under scalar multiplication, and
  3. the subset is closed under addition of any $2$ vectors in that subset.

$X = \begin{bmatrix}x_1 & x_2& x_3 \end{bmatrix}$ where $x_1 + x_2 + x_3 = 0$

1. $X$ contains zero vector $\begin{bmatrix}0 & 0& 0 \end{bmatrix}$ as $0 + 0 + 0 = 0$

2. Let $C$ be a scalar (any real number)

$C\begin{bmatrix}x_1 & x_2& x_3 \end{bmatrix} = \begin{bmatrix}Cx_1 & Cx_2& Cx_3 \end{bmatrix} \in X$ because $Cx_1 + Cx_2 + Cx_3 = C (x_1 + x_2 + x_3) = C.0 = 0$.

3. Let $Y=\begin{bmatrix}y_1 \\ y_2\\  y_3\end{bmatrix} \in X$ where $y_1 + y_2 + y_3 = 0$ and $Z=\begin{bmatrix}z_1 \\ z_2\\  z_3\end{bmatrix} \in X$ where $z_1 + z_2 + z_3 = 0.$

$Y + Z  = \begin{bmatrix}y_1+ z_1 \\ y_2+ z_2\\  y_3 + z_3\end{bmatrix} \in X$ as

$y_1 + z_1+ y_2 + z_2 + y_3  + z_3$

$= (y_1 + y_2 + y_3) + (z_1 + z_2 + z_3) =0.$

So, clearly $X$ is a subspace of $R^3.$

So (A) is the answer.
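The three subspace conditions can also be spot-checked numerically (a small numpy sketch; the sample vectors are arbitrary choices satisfying the constraint):

```python
import numpy as np

def in_X(v, tol=1e-9):
    """Membership test for X: the components must sum to zero."""
    return abs(v.sum()) < tol

zero = np.zeros(3)
y = np.array([ 1.0, -4.0, 3.0])    #  1 - 4 + 3 = 0, so y is in X
z = np.array([-2.0,  0.5, 1.5])    # -2 + 0.5 + 1.5 = 0, so z is in X
c = 3.7                            # an arbitrary scalar

print(in_X(zero))    # True: X contains the zero vector
print(in_X(c * y))   # True: X is closed under scalar multiplication
print(in_X(y + z))   # True: X is closed under addition
```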

by Loyal (8k points)
selected by
+1
Thank you very much for the detailed solution.
0

@Vicky rix  

Conditions for a basis: it should span the space and not contain any redundant vector,

i.e., the vectors should be linearly independent.

We can verify option 1: it contains linearly independent vectors and, along with that, they span $X$. With this we can directly eliminate option 2 (which contradicts option 1) and option 3 (since we have established that $X$ is a subspace of $R^3$).

can we do with this approach?

0
I am not getting how it would not span $X$. Is the answer (A) or (B)?
0

@Kaluti

Both vectors are linearly independent and span the whole subspace $X$. It means that if you take a linear combination of both vectors, the resultant vector will also be in the subspace $X$, i.e., it will satisfy the given condition.

0
It means that when we take the determinant of those two vectors together with any vector of the subspace, it will always come out to be zero, which means a linear combination of those two vectors lies in that subspace. Am I correct?
0
The determinant is defined only for square matrices. Here we have $2$ vectors which have $3$ components, so you can't say anything through a determinant.

A basis for a vector space contains the right number (neither fewer nor more) of linearly independent vectors which span the whole vector space. It means you can take any linear combination of these linearly independent vectors, and the resultant vector will also be in that space.
0
But if, along with those two vectors, I also include a third vector which satisfies the subspace condition, then the determinant will be zero, right?
0
so answer will be A right
0
yes
0
thanks
0

@Kaluti @ankitgupta.1729

A basis means there should be a leading $1$ in the reduced form of the given span.

ref here :https://yutsumura.com/find-a-basis-for-the-subspace-spanned-by-five-vectors/

right?

So, according to (A), the basis of the subspace will have $2$ vectors.

But is it not possible here that, in some case, the basis of the subspace has $3$ vectors?

+1
A basis of a vector space is a set of linearly independent vectors which spans that vector space.

A basis is not unique, but the number of vectors should be the same in all bases.

In simple words, suppose we have a vector space which contains many vectors. Now choose some linearly independent vectors from that vector space; if, by taking linear combinations of these independent vectors, we can get all the vectors which are in that vector space, then those linearly independent vectors form a basis for that vector space.
0

@ankitgupta.1729

Where are you reading this from? It is really hard to get this concept from Narsingh Deo.

+1
@srestha mam, I followed Prof. Gilbert Strang's book and his video lectures for it..
0

@ankitgupta.1729

Check this from the Deo book, it supports my argument. In this vector space $X$, the basis can have $3$ vectors too.


but in the matrix

$\begin{bmatrix} 1&-1 &0 \\ 1&0&-1 \\ \end{bmatrix}$

with row and column operations it gives $\begin{bmatrix} 1&0 &0 \\ 0&1&0 \\ \end{bmatrix}$

So, it is a basis of the vector space $X$

right?

0
mam, which row operations have you used?

$\{(1,0,0),(0,1,0)\}$ does not form a basis for subspace $X$ because if you take the linear combination $(1,0,0) + (0,1,0) = (1,1,0)$, this vector is not in subspace $X$, because the sum of its components is not zero, which is the requirement of subspace $X$. So, $\{(1,0,0),(0,1,0)\}$ does not span the subspace $X$ and hence it is not eligible to be called a basis for subspace $X$.
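A one-line check of this counterexample (numpy sketch):

```python
import numpy as np

candidate = np.array([1, 0, 0]) + np.array([0, 1, 0])   # = (1, 1, 0)
print(candidate.sum())        # 2, not 0, so (1,1,0) violates x1 + x2 + x3 = 0
print(candidate.sum() == 0)   # False -> (1,1,0) is not in X
```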
0
That resultant vector represents all possible vectors, whether dependent or independent, in that subspace. Am I right?
0

@ankitgupta.1729

Where did you get that the linear combination always needs to be $0$?

Finding the basis vectors is nothing but counting the number of pivots.

right?

0
Here the row space of the matrix is in $R^{3}$ and the column space would be in $R^{2}$,

right?

But as I said, it is not the only basis of the matrix. A basis can change, and this is one of them for this subspace.
0
The column space is in $R$; the matrix should be $[1,1,1]$ for the given subspace (check my answer below).

If we have to find the number of vectors in a basis for the column space, then we have to count the number of pivots, which is the same as the number of linearly independent vectors, which is also the same as the rank of the matrix. To find the number of vectors in a basis for the null space of the matrix, we compute (number of columns of the matrix) $-$ (rank of the matrix).
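For instance, a quick sympy sketch of these counts for the matrix $[1,1,1]$:

```python
from sympy import Matrix

A = Matrix([[1, 1, 1]])      # the constraint x1 + x2 + x3 = 0 as a matrix

rref_form, pivot_cols = A.rref()
print(rref_form)             # Matrix([[1, 1, 1]]): already in reduced row echelon form
print(len(pivot_cols))       # 1 pivot => rank(A) = 1 => column-space basis has 1 vector
print(A.rank())              # 1
print(A.cols - A.rank())     # 3 - 1 = 2 => basis of the null space (the subspace X) has 2 vectors
```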
+6 votes

Option $(a)$, here is the answer,

http://math.stackexchange.com/a/1843452/153195

by Active (1.8k points)
edited by
0
Here from the options we can conclude that the two given column vectors are linearly independent.

Let $A$ be the first vector and $B$ the second, and let $k_1, k_2$ be constants. From $k_1A + k_2B = 0$ we get $k_1 = k_2 = 0$.

But how do we analyse whether these vectors span $X$ or not?
0
What is a subspace ?
+4 votes

I think (B) is the answer. We require two constants $c_1$ and $c_2$ which satisfy $R^3$, which is:

let $X=\begin{bmatrix} 1&0&0\\ 0&1&0\\ 0&0&1 \end{bmatrix}$, $A=\begin{bmatrix} 1\\ -1\\ 0 \end{bmatrix}$, $B=\begin{bmatrix} 1\\ 0\\ -1 \end{bmatrix}$

Write $[A \; B \mid X]$ in echelon form; it comes out to be inconsistent and hence it does not span.
 

by Active (3.8k points)
edited by
0
What is a subspace? (as given in the options)
0
$X = \{(x_1, x_2, x_3): x_1 + x_2 + x_3 = 0\}$

It implies,

$X = \{(-x_2 - x_3, x_2, x_3): x_2, x_3 \in R\}$

which can also be expressed as,

$\{x_2(-1, 1, 0) + x_3(-1, 0, 1): x_2, x_3 \in R\}$.

Equivalently, it also holds as,

$\{x_2(1, -1, 0) + x_3(1, 0, -1): x_2, x_3 \in R\}$.
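A small numpy check of this rewriting (the generator pairs are taken directly from the two forms above):

```python
import numpy as np

gen1 = np.array([[-1, 1, 0], [-1, 0, 1]])    # generators from the first form
gen2 = np.array([[ 1, -1, 0], [ 1, 0, -1]])  # generators from the second form

# Every generator satisfies x1 + x2 + x3 = 0.
print(gen1.sum(axis=1), gen2.sum(axis=1))    # [0 0] [0 0]

# Both pairs span the same 2-dimensional plane: stacking all four
# vectors does not raise the rank above 2.
print(np.linalg.matrix_rank(np.vstack([gen1, gen2])))   # 2
```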
0
I am not getting how it would not span $X$.
+4 votes

Here, subspace $X \subseteq \mathbb{R}^{3}$ is defined as :-

$$X = \left \{x \in \mathbb{R}^3 \mid x_1 + x_2 + x_3 = 0, \text{ where } x^T = \left[x_1,x_2,x_3\right]\right \},$$

which means that the sum of the components of every vector in the subspace $X$ is zero.

We can write  $ x_1 + x_2 + x_3 = 0$ as

                                                 $\begin{bmatrix} 1 &1 &1 \end{bmatrix}$$\begin{bmatrix} x_{1}\\x_2 \\ x_3 \end{bmatrix} = 0$

Now, vector $x$ lies in the null space of the matrix $[1,1,1]$, and we have to find a basis for this null space.

Since the sum of all the components of vector $x$ is zero, we can write $x_3 = -x_2 - x_1$.

So,

$\begin{bmatrix} x_{1}\\x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_{1}\\x_2 \\ -x_2-x_1 \end{bmatrix} =x_1\begin{bmatrix} 1\\0 \\ -1 \end{bmatrix} + x_2\begin{bmatrix} 0\\1 \\-1 \end{bmatrix}$ where $x_1,x_2 \in \mathbb{R}$

So, the vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ are in the subspace $X.$

Now, we have to check whether these vectors are linearly independent or not.

So,

$k_1\begin{bmatrix} 1\\0 \\-1 \end{bmatrix} + k_2\begin{bmatrix} 0\\1 \\-1 \end{bmatrix} = \begin{bmatrix} 0\\0 \\0 \end{bmatrix}$

$\Rightarrow$ $k_1 = 0$ and $k_2 = 0$ which implies both vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ are linearly independent.

Since $x \in \mathbb{R}^{3}$ and it is in the null space, the number of columns is $n = 3$ (as can be seen from the matrix $[1,1,1]$).

Now, call the matrix $[1,1,1]$ as $A$. So, $rank(A)= r = 1$.

So, $dimension(N(A)) = n-r = 3-1 =2$. It means a basis for the null space should have $2$ vectors. A basis contains linearly independent vectors which must span the whole vector space.

Hence, we can say that here basis has $2$ linearly independent vectors which are  $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$

So, Basis for $X$ is $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}.$
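A sympy cross-check of this null-space computation (a sketch; sympy may return the basis vectors in a different but equivalent form):

```python
from sympy import Matrix

A = Matrix([[1, 1, 1]])      # the constraint x1 + x2 + x3 = 0 as a matrix
null_basis = A.nullspace()   # a basis of N(A)
print(len(null_basis))       # 2, matching dim N(A) = n - r = 3 - 1

# The hand-derived vectors (1,0,-1) and (0,1,-1) lie in N(A) ...
for v in (Matrix([1, 0, -1]), Matrix([0, 1, -1])):
    print(A * v)             # Matrix([[0]]) each time
# ... and together with sympy's basis they still span only a 2-D space.
stacked = Matrix.hstack(*null_basis, Matrix([1, 0, -1]), Matrix([0, 1, -1]))
print(stacked.rank())        # 2
```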

A basis is not unique. We can have many bases for a given vector space, but the number of vectors in all the bases should be the same. Since the options contain a different basis, we will derive another basis for the subspace $X$.

Since the basis for $X$ is $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$, the vectors $\begin{bmatrix} 1\\0 \\-1 \end{bmatrix}$ and $\begin{bmatrix} 0\\1 \\-1 \end{bmatrix}$ span the whole subspace $X$. So, on taking a linear combination of both vectors, we will get another vector in $X$:

$1*\begin{bmatrix} 1\\0 \\-1 \end{bmatrix} + (-1)\begin{bmatrix} 0\\1 \\ -1 \end{bmatrix} = \begin{bmatrix} 1\\-1 \\0 \end{bmatrix}.$

So, vectors $\begin{bmatrix} 1\\-1 \\0 \end{bmatrix}\; and \; \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}$ will be in subspace $X.$ Now, we have to check whether these are linearly independent or not.

So, $k_1\begin{bmatrix} 1\\-1 \\ 0 \end{bmatrix} +k_2\begin{bmatrix} 1\\0 \\ -1 \end{bmatrix} = \begin{bmatrix} 0\\0 \\ 0 \end{bmatrix}$ which implies $k_1=0$ and $k_2=0.$

It means vectors $\begin{bmatrix} 1\\-1 \\0 \end{bmatrix}\; and \; \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}$ are linearly independent and form the basis for subspace $X.$

So, Another Basis for $X$ is $\left \{ \begin{bmatrix} 1\\-1 \\0 \end{bmatrix}, \begin{bmatrix} 1\\0 \\-1 \end{bmatrix} \right \}.$

Hence, Answer is $(A).$
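As a numerical cross-check (a numpy sketch), the coefficients expressing the option-(A) vector $[1,-1,0]^T$ in the first basis can be recovered with least squares:

```python
import numpy as np

# Null-space basis as columns of a 3x2 matrix.
B = np.array([[1, 0, -1],
              [0, 1, -1]]).T
target = np.array([1, -1, 0])    # first vector of the basis in option (A)

# A (near-)zero residual confirms that (1,-1,0) is a linear
# combination of the null-space basis vectors.
coeff, residual, rank, _ = np.linalg.lstsq(B, target, rcond=None)
print(coeff)      # approx [ 1., -1.]  =>  1*(1,0,-1) - 1*(0,1,-1) = (1,-1,0)
print(residual)   # ~0
```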

by Boss (17.1k points)
edited by
+1

@ankitgupta.1729

You are taking $k_{1}=0,k_{2}=0$ and intentionally forming a matrix which is linearly independent;

the vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$ otherwise never produce $\left \{ \begin{bmatrix} 0\\0 \\0 \end{bmatrix}\right \}$.

right? but why?

Another thing: you got the vectors $\left \{ \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}, \begin{bmatrix} 0\\1 \\-1 \end{bmatrix} \right \}$;

how is that matching with $\left \{ \begin{bmatrix} 1\\-1 \\0 \end{bmatrix} \right \}^{T}$?

 

0

@srestha mam, Thanks. Yes, I got a different basis which does not match the given options. I have updated the answer.

0

I have got $3$ points:

  • A span is a linear combination of vectors.
  • A basis is where the span$\left ( \vec{v_{1}},\vec{v_{2}},\ldots,\vec{v_{n}} \right )$ consists of linearly independent vectors.
  • And every span is a valid subspace.

Now, you first took $2$ vectors and made a $3$rd vector as a combination of them, and

once you took $k_{1}=1,k_{2}=-1$, and another time $k_{1}=0,k_{2}=0$.

These two points are not clear to me.

Can we change the $k_{1},k_{2}$ values independently?

Moreover, do the span as a linear combination of the two vectors and the span of any one of those vectors give the same output?

0

once you took $k_1=1,k_2=-1$, and another time $k_1=0,k_2=0$.

These two points are not clear to me.

I have written $k_1=0$ and $k_2=0$ for both bases.

Moreover, do the span as a linear combination of the two vectors and the span of any one of those vectors give the same output?

Sorry, I am not getting you.

0

I have written $k_1=0$ and $k_2=0$ for both bases.

What does "both bases" mean?

+1
Bases is the plural form of basis. I have written the 1st basis as $\{(1,0,-1),(0,1,-1)\}$ and the 2nd basis as $\{(1,-1,0),(1,0,-1)\}$ in the answer, and in both cases $k_1$ and $k_2$ are zero.
+1

@ankitgupta.1729 

Are the following implications correct?

1. A basis of vector space $X$ means a set of linearly independent vectors whose linear combinations will give $X$.

2. Just by looking at the options, it can be deduced that $[\begin{matrix} 1 & 0 & -1\end{matrix}]$ cannot be expressed as $ k \times [\begin{matrix} 0 & 1 & -1 \end{matrix}], k\in R$, so they're linearly independent and a good candidate for a basis. So, the answer narrows down to options (A) and (B).

3. Now, they should be able to produce any vector in $X$ by linear combination.

[This is where I got stuck. Is there an easy way to check this step? I would definitely mark (A) if this were asked in the exam, since I couldn't think of a vector that can't be generated by the given two.]

 

+1

@toxicdesire

all points are correct.

Now, "whether a vector in $X$ is a linear combination of those $2$ vectors or not" can be easily seen from the step in the answer above;

or, note that the size of a spanning set of a vector space is always greater than or equal to the dimension of that vector space.

0
Yes! I think that's what I indirectly did in my mind, that's neat.

Also, another thing:

Is the cardinality of a basis always unique for a given vector space?
+1
yes.