67 votes

Let $c_{1}, \ldots, c_{n}$ be scalars, not all zero, such that $\sum_{i=1}^{n}c_{i}a_{i} = 0$, where the $a_{i}$ are column vectors in $\mathbb{R}^{n}$.

Consider the set of linear equations

$Ax = b$

where $A=\left [ a_{1} \ldots a_{n} \right ]$ and $b=\sum_{i=1}^{n}a_{i}$. The set of equations has

  1. a unique solution at $x=J_{n}$, where $J_{n}$ denotes an $n$-dimensional vector of all $1$s.
  2. no solution
  3. infinitely many solutions
  4. finitely many solutions

9 Answers

Best answer
83 votes
$\sum_i c_i a_i = 0$ with $\exists i: c_i \ne 0$ indicates that the column vectors of $A$ are linearly dependent, so the determinant of $A$ is zero. Therefore $Ax=b$ has either no solution or infinitely many solutions. From $\sum_i a_i = b$, it is clear that the $n$-dimensional vector of all $1$s is a solution of $Ax=b.$

Hence, $Ax=b$  will have infinitely many solutions. The correct answer is $(C)$.
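A quick numerical sanity check of this argument (my own sketch, using an arbitrarily chosen matrix with dependent columns; not part of the original answer):

```python
# Build a 3x3 matrix whose columns are linearly dependent, set b to the sum
# of its columns, and verify both claims of the answer above.
import numpy as np

a1 = np.array([1.0, 2.0, 3.0])
a2 = np.array([4.0, 5.0, 6.0])
a3 = a1 + a2                            # c = (1, 1, -1) gives c1*a1 + c2*a2 + c3*a3 = 0
A = np.column_stack([a1, a2, a3])
b = A.sum(axis=1)                       # b = a1 + a2 + a3

print(np.linalg.det(A))                 # ~0: dependent columns force det(A) = 0
print(np.allclose(A @ np.ones(3), b))   # True: the all-ones vector J_n solves Ax = b
```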
138 votes

   Answer is (C)

18 votes

First, let's unpack the question, and then build the answer step by step.

We are given scalars that are not all zero, i.e. $\exists i : c_i \neq 0$ with $1 \leq i \leq n$, such that

$\sum_{i=1}^{n}c_i a_i=0$, where the $a_i$ are column vectors (of size $n \times 1$) in $\mathbb{R}^n$.

so this means

$c_1\begin{bmatrix} x_{1,1}\\ x_{2,1}\\ \vdots\\ x_{n,1} \end{bmatrix} + c_2\begin{bmatrix} x_{1,2}\\ x_{2,2}\\ \vdots\\ x_{n,2} \end{bmatrix} + \cdots + c_n\begin{bmatrix} x_{1,n}\\ x_{2,n}\\ \vdots\\ x_{n,n} \end{bmatrix} = 0$

So if I collect all the $c_i$ for $1 \leq i \leq n$ into a vector, say $k=\begin{bmatrix} c_1\\ c_2\\ \vdots\\ c_n \end{bmatrix}$, then this $k$ is a non-zero vector.

Now, given $A=[a_1 \ldots a_n]$, we have a vector $k$ such that $Ak=0$,

or, put another way, $(A-0 \cdot I)k=0$ with $k \neq 0$, so $0$ is an eigenvalue of $A$; hence, matrix $A$ has determinant $0$.
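Here is a quick numerical illustration of this step (a sketch with an arbitrarily chosen singular matrix; not from the original answer):

```python
# If Ak = 0 for some non-zero k, then 0 is an eigenvalue of A and det(A) = 0.
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])        # third column = first column + second column

print(np.linalg.eigvals(A))            # one eigenvalue is (numerically) 0
print(np.linalg.det(A))                # ~0, as claimed
```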

Now, since $Ak=0$, for any constant $h \neq 0$ we also have $A(hk)=0$; let $h$ range over all real numbers except $0$.

Next, we are given that $b=\sum_{i=1}^{n}a_i$, which means that summing all the columns of $A$ gives $b$,

so

$A\begin{bmatrix} 1\\ 1\\ \vdots\\ 1 \end{bmatrix}=b$, and if we call this $n \times 1$ vector of all $1$s $x$, then $Ax=b$.

Now, $Ax=b$ and $A(hk)=0$ together imply $A(x+hk)=b$ for any non-zero constant $h$.

Hence, this system of equations has infinitely many solutions, because there are infinitely many choices of $h$.

Answer: (C)
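A small NumPy sketch of this infinite family $x + hk$ (my own example matrix, chosen so that the third column is the sum of the first two; not part of the original answer):

```python
# Exhibit the family x + h*k of solutions: x is the all-ones vector,
# k = (1, 1, -1) satisfies Ak = 0 because col3 = col1 + col2.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 2.0]])
b = A.sum(axis=1)                  # b = sum of the columns of A
x = np.ones(3)                     # Ax = b
k = np.array([1.0, 1.0, -1.0])     # Ak = 0, with k non-zero

print(np.allclose(A @ k, 0))       # True: k lies in the null space of A
for h in (-2.0, 0.5, 10.0):        # every non-zero h yields another solution
    print(np.allclose(A @ (x + h * k), b))   # True, True, True
```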

4 votes

A vector space can be finite-dimensional or infinite-dimensional, depending on the number of linearly independent basis vectors. The definition of linear dependence, and the ability to determine whether a subset of vectors in a vector space is linearly dependent, are central to determining a basis for a vector space.

$\sum_{i=1}^{n}c_{i}a_{i}=0$

where the $c_{i}$ are scalars and the $a_{i}$ are vectors.

Since this linear combination equals $0$ even though the scalars $c_{i}$ are not all zero, the vectors $a_{i}$ are linearly dependent.

Again given $Ax=b$

$A=\left [ a_{1},a_{2},\ldots ,a_{n} \right ]$

$b=\sum_{i=1}^{n}a_{i}=a_{1}+a_{2}+\ldots +a_{n}$

which means that $x=\begin{bmatrix} 1 & 1 & \ldots & 1 \end{bmatrix}^{T}$ is one solution.

Since the columns of $A$ are linearly dependent, adding any null-space vector of $A$ to this solution yields another one, and all of these solutions lie on a line.

So, the answer is that there are infinitely many solutions.
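As a concrete $2 \times 2$ instance of this picture (my own example, not from the original answer): take $A=\begin{bmatrix} 1 & 2\\ 2 & 4 \end{bmatrix}$, so $2a_{1}-a_{2}=0$ and the columns are linearly dependent. Then $b=a_{1}+a_{2}=\begin{bmatrix} 3\\ 6 \end{bmatrix}$, and $x=\begin{bmatrix} 1\\ 1 \end{bmatrix}$ is a solution; but so is $\begin{bmatrix} 1\\ 1 \end{bmatrix}+h\begin{bmatrix} 2\\ -1 \end{bmatrix}$ for every real $h$, since $A\begin{bmatrix} 2\\ -1 \end{bmatrix}=0$. Every point on the line $x_{1}+2x_{2}=3$ solves the system.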

________________________________________________________

  • Linear combination: In mathematics, a linear combination is an expression constructed from a set of terms by multiplying each term by a constant and adding the results (e.g. a linear combination of $x$ and $y$ would be any expression of the form $ax + by$, where $a$ and $b$ are constants).
  • Linearly dependent: Intuitively, vectors being linearly independent means they represent independent directions in your vector space, while linearly dependent vectors means they do not. For example, if you have a set of vectors $\{x_1, \ldots, x_5\}$ and you walk some distance in the $x_1$ direction, then a different distance in the $x_2$ direction, then again in the direction of $x_3$, and in the end you are back where you started, then the vectors are linearly dependent (notice that not all of the vectors were used).
  • https://math.stackexchange.com/questions/456002/what-exactly-does-linear-dependence-and-linear-independence-imply
  • http://onlinemschool.com/math/library/vector/linear-independence/
