Consider a matrix P whose only eigenvectors are the multiples of $\begin{bmatrix} 1 \\ 4 \end{bmatrix}$.

Consider the following statements.

I. P does not have an inverse
II. P has a repeated eigenvalue
III. P cannot be diagonalized

Which one of the following options is correct?

A. Only I and III are necessarily true
B. Only II is necessarily true
C. Only I and II are necessarily true
D. Only II and III are necessarily true

edited | 5.7k views
+2
option D is correct

+4
We know that if the eigenvalues of a matrix are distinct, then the eigenvectors are independent. When we take the contrapositive of this, it says that if the eigenvectors are not independent, then the eigenvalues are not distinct.

Also, an $n \times n$ matrix is diagonalisable if and only if it has $n$ independent eigenvectors. Thus, this matrix cannot be diagonalised. And since the eigenvectors are not independent, the eigenvalues must repeat.
+2
Read Theorem no 168, pg no 7

web.mit.edu/14.102/www/notes/lecturenotes0927.pdf
+1


Theorem: Suppose the $n \times n$ matrix $A$ has $n$ linearly independent eigenvectors. If these eigenvectors are the columns of a matrix $S,$ then $S^{-1}AS$ is a diagonal matrix $\Lambda$. The eigenvalues of $A$ are on the diagonal of $\Lambda$.

$S^{-1}AS = \Lambda = \begin{bmatrix} \lambda _{1} & & & \\ & \lambda _{2} & & \\ & & \ddots & \\ & & & \lambda _{n} \end{bmatrix}$ (a diagonal matrix whose diagonal entries are the eigenvalues of $A$)

Now if $A$ is diagonalizable, $S^{-1}$ must exist. What is $S?$ $S$ is the matrix whose columns are the eigenvectors of $A.$

$S^{-1}$ exists only if $S$ is invertible, and $S$ is invertible only if its rows and columns are independent.

If the eigenvectors are repeated (not independent), then $S^{-1}$ won't exist and hence $A$ won't be diagonalizable.

Even if a matrix $A$ has repeated eigenvalues, that does not mean it is not diagonalizable. Take the trivial example of the identity matrix $I.$

$\begin{bmatrix} 1 & & \\ & 1& \\ & & 1 \end{bmatrix}$

The Eigen values are $1,1,1$ and Eigen vectors are $\begin{bmatrix} 1\\ 0\\ 0\end{bmatrix}$ , $\begin{bmatrix} 0\\ 1\\ 0\end{bmatrix}$, $\begin{bmatrix} 0\\ 0\\ 1\end{bmatrix}$

We form $S=\begin{bmatrix} 1 & 0 &0 \\ 0 & 1 &0 \\ 0 & 0& 1 \end{bmatrix}$ and this is invertible.

So, this makes $S^{-1} I S = I = \Lambda$, which is in fact a diagonal matrix.

So, even if we have same eigenvalues the matrix may or may not be diagonalizable. But yes, we need full $n$ set of linearly independent eigenvectors for this matrix $A$ of size $n \times n$ to be diagonalizable.
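As a quick numerical illustration of the theorem (a sketch using NumPy; the matrix $A$ below is my own example, not the $P$ from the question):

```python
import numpy as np

def diagonalize(A):
    """Return (S, Lam) with S^{-1} A S = Lam, assuming A has a full
    set of independent eigenvectors (so that S is invertible)."""
    _, S = np.linalg.eig(A)          # columns of S are eigenvectors of A
    Lam = np.linalg.inv(S) @ A @ S   # should come out diagonal
    return S, Lam

# A symmetric matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
S, Lam = diagonalize(A)

# Lam is (numerically) diagonal, with the eigenvalues of A on its diagonal.
assert np.allclose(Lam, np.diag(np.diagonal(Lam)))
assert np.allclose(sorted(np.diagonal(Lam)), [1.0, 3.0])
```

For this $A$ the eigenvectors are independent, so $S$ is invertible and the similarity transform produces $\Lambda$ exactly as the theorem says.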

Now our problem says that we have a matrix $P$ whose only eigenvectors are multiples of $\begin{bmatrix} 1\\ 4 \end{bmatrix}$.

That means $\begin{bmatrix} 1\\ 4 \end{bmatrix}$ is the only independent eigenvector. Surely, this matrix is not diagonalizable.

And since all eigenvectors are multiples of $\begin{bmatrix} 1\\ 4 \end{bmatrix}$, the eigenvalues must be repeated. To see why:

Suppose, for contradiction, that $\lambda _{1}$ and $\lambda _{2}$ are two distinct eigenvalues of $P$.

Since the only eigenvectors are multiples of $\begin{bmatrix} 1\\ 4 \end{bmatrix}$, both eigenvalues must share this eigenvector.

Let this vector be $x_1.$

So, $Px_1=\lambda _{1}x_1$ and $Px_1=\lambda _{2}x_1$

$\implies \lambda _{1}x_1=\lambda _{2}x_1$

$\implies (\lambda _{1}-\lambda _{2})x_1=0.$

Since $x_1 \neq 0$, we get $\lambda _{1} =\lambda _{2}$, a contradiction. So the eigenvalues must coincide.

So yes, $P$ has repeated Eigen Values. (II) statement is true.
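For concreteness, here is one matrix with exactly the properties the question describes (my own construction, sketched in NumPy: $P = 3I + N$ with $N$ nilpotent and $\ker N$ spanned by $(1,4)$):

```python
import numpy as np

# A concrete (hypothetical) P whose only eigenvectors are multiples
# of [1, 4]: take P = 3*I + N, where N is nilpotent (N @ N = 0) and
# the kernel of N is spanned by [1, 4].
N = np.array([[-4.0, 1.0],
              [-16.0, 4.0]])
P = 3.0 * np.eye(2) + N

# N annihilates [1, 4], so [1, 4] is an eigenvector with eigenvalue 3.
v = np.array([1.0, 4.0])
assert np.allclose(P @ v, 3.0 * v)

# Both eigenvalues equal 3 (repeated), matching statement II.
assert np.allclose(np.linalg.eigvals(P), [3.0, 3.0])

# rank(P - 3I) = 1, so the eigenspace is one-dimensional:
# every eigenvector of P is a multiple of [1, 4].
assert np.linalg.matrix_rank(P - 3.0 * np.eye(2)) == 1
```

Note this particular $P$ also has $\det(P) = 9 \neq 0$, so it is invertible, which already hints that statement (I) is not forced by the hypothesis.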

Now Statement (I) is not necessarily true; we cannot assert it with certainty.

Consider the matrix $A=\begin{bmatrix} 1 & 1\\ 0 & 1 \end{bmatrix}$

Its eigenvalues are $\lambda _{1}=\lambda _{2}=1$.

All the eigenvectors of this $A$ are multiples of the vector $(1,0)$:

$(A-I)x=\begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix} x=\begin{bmatrix} 0\\ 0 \end{bmatrix}$

or $x=\begin{bmatrix} c\\ 0 \end{bmatrix}$.

This matrix is surely not diagonalizable, but it has determinant $=1$.

Since the determinant is not $0,$ $A^{-1}$ exists.

Hence, (II) and (III) are surely valid under all cases of this question.

For (I): matrix $P$ has no inverse exactly when $det(P)=0$, i.e., when one of the eigenvalues of $P$ is zero. But the question says nothing about the values of the eigenvalues, so (I) is not necessarily true.
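The same point can be checked numerically (a NumPy sketch; the $2\times 2$ Jordan block below is an illustrative example of mine): a matrix can fail to be diagonalizable and still be invertible, so (I) does not follow.

```python
import numpy as np

# Jordan block: repeated eigenvalue 1, but only one independent
# eigenvector (multiples of [1, 0]).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Invertible: determinant is 1, not 0.
assert np.isclose(np.linalg.det(A), 1.0)

# Geometric multiplicity of eigenvalue 1 is n - rank(A - 1*I) = 1,
# less than its algebraic multiplicity 2 -> not diagonalizable.
gm = 2 - np.linalg.matrix_rank(A - np.eye(2))
assert gm == 1
```

So non-diagonalizability says nothing about invertibility; only a zero eigenvalue would make $P$ singular.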

Answer-(D) (Remember options says "necessarily true")

by Boss (27.3k points)
edited by
+4

Is it like this: given that all eigenvectors are multiples of $\begin{bmatrix} 1\\ 4\\ \end{bmatrix}$,

i.e.  $k\begin{bmatrix} 1\\ 4\\ \end{bmatrix}$ , $k\neq 0$

==> all eigenvectors are linearly dependent; each is a scalar multiple of the others.

So there is no chance of distinct eigenvalues, because eigenvectors corresponding to distinct eigenvalues are independent

==> there must be a repeated eigenvalue.

0
yes
0

If we had repeated eigenvectors, then $S^{-1}$ wouldn't exist and hence A wouldn't be diagonalizable.

0

"P cannot be diagonalized"

Doesn't it imply "P does not have an inverse"?

+1

@srestha - No. For that you would need to prove that $0$ is one of the eigenvalues of P. Nothing is said here about what the matrix P is or what its eigenvalues are.

0

@jk_1 - Your matrix S is built from the eigenvectors of A: the columns of S are the eigenvectors of the matrix A. So if the eigenvectors are repeated (dependent), S won't have a full set of n independent columns; the rank of S will then be less than n, the determinant of S will be 0, and $S^{-1}$ won't exist.

+1
how matrix A has determinant -1?
0
Did I talk about $-1$ as the determinant somewhere? :O
0
"This matrix surely is not diagonalizable but our this matrix A has determinant = -1"
0

@gilgamesh - I was not talking about the matrix given in the question. I took a sample matrix A, as shown in the answer; that line was written for it.

0
$A = \begin{bmatrix} 0&1 \\ 0 &0 \end{bmatrix}$

This matrix?
0
Sir, your example for " statement: P does not have an inverse" is not clear..
0
I hope it must be clear now.

+1

"​​​​​​Inverse of P does not exist only if one of the $\lambda=0$  "

I think this statement is enough to show that statement I is not necessarily correct, as nothing is given about the eigenvalues of P.

0
The example should have been

$\begin{bmatrix} 1 & 1\\ 0 & 1 \end{bmatrix}$

It is given that the only eigenvectors of the matrix P are multiples of $[1 \; 4]^T$. Eigenvalues and eigenvectors are defined only for square matrices, and since the given eigenvector has $2$ components, P must be a $2 \times 2$ matrix.

Now we can eliminate option 1: we can't say for sure whether the matrix has an inverse or not from the given eigenvector, so it is not necessarily true.

Option 2: Since the matrix is $2 \times 2$ and it is given that there is only one independent eigenvector, the matrix must have repeated eigenvalues. If the eigenvalues were distinct, the characteristic equation (a degree-2 polynomial) would have distinct roots, and there would be 2 linearly independent eigenvectors. We have only one, so option 2 is correct.

Option 3: In general, an $n \times n$ matrix with repeated eigenvalues can still be diagonalized if it has $n$ linearly independent eigenvectors; this happens when each repeated eigenvalue of algebraic multiplicity N (the number of times the eigenvalue is repeated) has N linearly independent eigenvectors. Here all eigenvectors are multiples of a single vector, so there is only one independent eigenvector and P cannot be diagonalized. Hence option 3 is also correct.

Please note that some matrices have repeated eigenvalues but are still diagonalizable, so repeated eigenvalues alone do not let you conclude that a matrix cannot be diagonalized; they are a necessary but not sufficient condition for non-diagonalizability. That is why the word "necessarily" in option d) matters, and why d) is the correct answer.
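The multiplicity criterion used above can be sketched in NumPy (the helper `geometric_multiplicity` is my own name, not a library function):

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """Dimension of the eigenspace of lam: n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

# Identity: eigenvalue 1 with algebraic multiplicity 3 and
# geometric multiplicity 3 -> diagonalizable (it already is diagonal).
assert geometric_multiplicity(np.eye(3), 1.0) == 3

# Jordan block: eigenvalue 1 with algebraic multiplicity 2 but
# geometric multiplicity 1 -> not diagonalizable.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
assert geometric_multiplicity(J, 1.0) == 1
```

A matrix is diagonalizable exactly when, for every eigenvalue, this geometric multiplicity equals the algebraic multiplicity.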

by (277 points)
+2
Here,

even the eigenvectors are repeated (the algebraic multiplicity exceeds the geometric multiplicity).

For diagonalisation, the necessary and sufficient condition is to have n linearly independent eigenvectors.

But here we have only 1 linearly independent eigenvector (the remaining ones are multiples of it).

Hence diagonalisation is not possible.

Eigenvectors are multiple of $\begin{bmatrix} 1\\ 4 \end{bmatrix}$ ,

So, it has repeated eigen value.

Hence, It cannot be Diagonalizable since repeated eigenvalue, [ we know if distinct eigen vector then diagonalizable]

$D$ must be the answer.

by Veteran (62.7k points)
edited
0

Is there any way to prove that P cannot be diagonalized? If yes, please answer according to the last example given in the following document.

http://www.math.tamu.edu/~julia/Teaching/diagonalization_Narcowich.pdf

+6
I got my answer from the following statement: "A matrix is diagonalizable iff the algebraic multiplicity equals the geometric multiplicity for each eigenvalue."
0

@Prashant
chk here

If there is a repeated eigenvalue, whether or not the matrix can be diagonalised depends on the eigenvectors

0
It is an ambiguous question.

Why was it not "marks to all"?
0
I think it's not an ambiguous question, because option D says "necessarily true", and indeed it is. For a matrix to not be diagonalizable, a necessary condition is that it has repeated eigenvalues. Note that this is only a necessary condition, not a sufficient one.
0

I am getting confused regarding statement III in the question.
In short, can anyone explain: "What is the necessary and sufficient condition for a matrix to be diagonalizable?"

I think there are multiple sets of necessary and sufficient conditions for a matrix to be diagonalizable. Am I correct?

+1

@Prashant. sir

It cannot be Diagonalizable since repeated eigenvalue, [ we know if distinct eigen vector then diagonalizable]

Repeated eigenvalues can have distinct eigen vectors so this statement is incorrect.

0
Yeah, you are right: a repeated eigenvalue can have one or more independent eigenvectors.
But here it says the only eigenvectors are multiples of [1 4]', so there is only one independent eigenvector;
hence geometric multiplicity = 1 while algebraic multiplicity = 2, so it is not diagonalizable.
0
If all eigenvalues are different, then all eigenvectors are linearly independent and all geometric and algebraic multiplicities are 1.

0

Say we take the multiple $2$ of $\begin{bmatrix} 1\\ 4 \end{bmatrix}$,

that means $2\cdot\begin{bmatrix} 1\\ 4 \end{bmatrix}=\begin{bmatrix} 2\\ 8 \end{bmatrix}$ is also an eigenvector.

Does it mean it has a repeated eigenvalue?

0
Yes
+1 vote
I think this is the correct solution, here goes:

First of all, the eigenvector is a $2 \times 1$ matrix, so P is $2 \times 2$ and has 2 eigenvalues (counted with multiplicity). They're either distinct, or not.
If they were distinct, the eigenvectors corresponding to them would be linearly independent, hence P would be diagonalizable.

But in the question, it is mentioned that the only eigenvectors it has are multiples of [1 4]. No pair of vectors from this set is linearly independent, hence P cannot be diagonalized, and the two eigenvalues are the same.
by (31 points)
0
I didn't get this.
+2

This might help

If the eigenvalues are distinct, then the eigenvectors are linearly independent and the matrix can be diagonalized. If an eigenvalue is repeated and the eigenvectors are linearly dependent, the matrix cannot be diagonalized. But if the eigenvectors are independent, then whether the eigenvalues are repeated or distinct, the matrix can be diagonalized.
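These three cases can be summarized in a small NumPy sketch (the test matrices are my own illustrative examples):

```python
import numpy as np

def is_diagonalizable(A, tol=1e-9):
    """An n x n matrix is diagonalizable iff it has n linearly
    independent eigenvectors, i.e., its eigenvector matrix has full rank."""
    n = A.shape[0]
    _, S = np.linalg.eig(A)
    return np.linalg.matrix_rank(S, tol=tol) == n

# Distinct eigenvalues -> independent eigenvectors -> diagonalizable.
assert is_diagonalizable(np.array([[2.0, 1.0], [1.0, 2.0]]))

# Repeated eigenvalue but independent eigenvectors -> still diagonalizable.
assert is_diagonalizable(np.eye(2))

# Repeated eigenvalue with dependent eigenvectors -> not diagonalizable.
assert not is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]]))
```

The rank test on the eigenvector matrix is exactly the "n independent eigenvectors" condition discussed throughout this thread.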