in Linear Algebra
713 views
7 votes

The eigenvalues of the matrix $X = \begin{pmatrix} 2 & 1 & 1  \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$ are

  1. $1,1,4$
  2. $1,4,4$
  3. $0,1,4$
  4. $0,4,4$

3 Answers

6 votes
Sum of eigenvalues $=$ trace of the matrix $= 6$

Product of eigenvalues $=$ determinant of the matrix $= 4$

Only option (A), $1,1,4,$ satisfies both properties: $1+1+4 = 6$ and $1\cdot 1\cdot 4 = 4.$

Hence option (A) is the answer.
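As a quick sanity check (a sketch assuming NumPy is available), both properties and the eigenvalues themselves can be verified numerically:

```python
import numpy as np

# The matrix from the question.
X = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]])

print(np.trace(X))              # 6, the sum of the eigenvalues
print(round(np.linalg.det(X)))  # 4, the product of the eigenvalues
# eigvalsh is appropriate here because X is symmetric.
print(np.linalg.eigvalsh(X))    # approximately [1, 1, 4]
```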
2 votes

For a $3\times 3$ matrix $A,$ we can write the characteristic equation

$$\mid A - \lambda I\mid = 0$$

$$\textbf{(OR)}$$

$$\lambda^{3}-\bigg(\Sigma \big(L\big)\bigg)\lambda^{2}  + \bigg(\Sigma \big(PC\big)\bigg) \lambda - \mid A \mid = 0$$

Where $\Sigma \big(L\big) =\text{Sum of leading diagonal elements (or) trace}$

and $\Sigma \big(PC\big) = \text{Sum of the leading diagonal cofactors}$

Here $\Sigma \big(L\big) = 2+2+2 = 6,$ $\Sigma \big(PC\big) = 3+3+3 = 9$ (each leading diagonal cofactor is $2\cdot 2 - 1\cdot 1 = 3$), and $\mid A \mid = 4.$

$\implies \lambda^{3} - 6\lambda^{2} + 9\lambda - 4 = 0\rightarrow(1)$
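These coefficients can be cross-checked numerically (a sketch assuming NumPy): `np.poly` returns the characteristic-polynomial coefficients of a square matrix, highest degree first.

```python
import numpy as np

# The matrix from the question.
X = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]])

# Coefficients of the characteristic polynomial, highest degree first.
print(np.poly(X))  # approximately [1, -6, 9, -4]
```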

To find the roots$:$

Put $\lambda = 1;$ it satisfies the equation, so $\lambda = 1$ is one of the roots. Regrouping the cubic around the factor $(\lambda - 1):$

$\lambda^{2}(\lambda  - 1) - 5\lambda  (\lambda  - 1) + 4 (\lambda  - 1) = 0$

$\implies (\lambda-1) (\lambda^{2} - 5\lambda + 4) = 0$

$\implies (\lambda-1) (\lambda^{2} - 4\lambda - \lambda + 4) = 0$

$\implies (\lambda-1) \big[\lambda(\lambda - 4) - 1 (\lambda - 4)\big] = 0$

$\implies (\lambda-1) \big[(\lambda - 4) (\lambda - 1)\big] = 0$

$\implies (\lambda-1)(\lambda-1)(\lambda-4) = 0$

$\implies \lambda = 1,1,4$

So, the correct answer is $(A).$
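The root-finding above can also be checked numerically (a sketch assuming NumPy), by handing the coefficients of equation $(1)$ to `np.roots`:

```python
import numpy as np

# Coefficients of λ³ − 6λ² + 9λ − 4 = 0, highest degree first.
coeffs = [1, -6, 9, -4]

# np.roots may return tiny imaginary parts for repeated roots; keep the real parts.
roots = np.sort(np.roots(coeffs).real)
print(roots)  # approximately [1, 1, 4]
```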

________________________________________________________________

Important properties of eigenvalues:

  1. Sum of all eigenvalues $=$ sum of the leading diagonal (principal diagonal) elements $=$ trace of the matrix.
  2. Product of all eigenvalues $= \det(A) = \mid A \mid$
  3. The eigenvalues of any diagonal or triangular (lower or upper) square matrix are its leading diagonal (principal diagonal) elements themselves.

Example:$A=\begin{bmatrix} 1& 0& 0\\ 0&1 &0 \\ 0& 0& 1\end{bmatrix}$

Diagonal matrix

Eigenvalues are $1,1,1$

$B=\begin{bmatrix} 1& 9& 6\\ 0&1 &12 \\ 0& 0& 1\end{bmatrix}$

Upper triangular matrix

Eigenvalues are $1,1,1$

$C=\begin{bmatrix} 1& 0& 0\\ 8&1 &0 \\ 2& 3& 1\end{bmatrix}$

Lower triangular matrix

Eigenvalues are $1,1,1$
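The three examples above can be confirmed in one loop (a sketch assuming NumPy):

```python
import numpy as np

# The three example matrices: A diagonal, B upper triangular, C lower triangular.
A = np.eye(3)
B = np.array([[1.0, 9.0,  6.0],
              [0.0, 1.0, 12.0],
              [0.0, 0.0,  1.0]])
C = np.array([[1.0, 0.0, 0.0],
              [8.0, 1.0, 0.0],
              [2.0, 3.0, 1.0]])

for M in (A, B, C):
    # For diagonal or triangular matrices, the eigenvalues are the diagonal entries.
    print(np.linalg.eigvals(M))  # all approximately 1
```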

1 vote

Sum of eigenvalues $=$ trace of the matrix $= 6$

There is no need to even find the determinant of the matrix.

Only option A matches.

Some important properties of eigenvalues:

  • The determinant of $A$ is the product of all its eigenvalues: $\det(A) = \prod_{i=1}^{n}\lambda_{i} = \lambda_{1}\lambda_{2}\cdots\lambda_{n}.$
  • The eigenvalues of the $k$th power of $A$, i.e., the eigenvalues of $A^{k}$, for any positive integer $k$, are $\lambda_{1}^{k},\ldots,\lambda_{n}^{k}.$
  • The matrix $A$ is invertible if and only if every eigenvalue is nonzero.
  • If $A$ is invertible, then the eigenvalues of $A^{-1}$ are $\frac{1}{\lambda_{1}},\ldots,\frac{1}{\lambda_{n}}$ and each eigenvalue's geometric multiplicity coincides. Moreover, since the characteristic polynomial of the inverse is the reciprocal polynomial of the original, the eigenvalues share the same algebraic multiplicity.
  • If $A$ is equal to its conjugate transpose $A^{*}$, or equivalently if $A$ is Hermitian, then every eigenvalue is real. The same is true of any symmetric real matrix.
  • If $A$ is not only Hermitian but also positive-definite, positive-semidefinite, negative-definite, or negative-semidefinite, then every eigenvalue is positive, non-negative, negative, or non-positive, respectively.
  • If $A$ is unitary, every eigenvalue has absolute value $\mid\lambda_{i}\mid = 1.$
  • If $A$ is an $n\times n$ matrix and $\{\lambda_{1},\ldots,\lambda_{k}\}$ are its eigenvalues, then the eigenvalues of the matrix $I+A$ (where $I$ is the identity matrix) are $\{\lambda_{1}+1,\ldots,\lambda_{k}+1\}.$ Moreover, if $\alpha\in\mathbb{C},$ the eigenvalues of $\alpha I+A$ are $\{\lambda_{1}+\alpha,\ldots,\lambda_{k}+\alpha\}.$ More generally, for a polynomial $P$ the eigenvalues of the matrix $P(A)$ are $\{P(\lambda_{1}),\ldots,P(\lambda_{k})\}.$

Source: https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Additional_properties_of_eigenvalues
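Several of the listed properties can be spot-checked numerically. The sketch below (assuming NumPy, with an arbitrarily chosen symmetric $2\times 2$ matrix) is illustrative, not exhaustive:

```python
import numpy as np

# Small symmetric test matrix, chosen arbitrarily for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam = np.sort(np.linalg.eigvals(A))

# det(A) equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(lam))

# The eigenvalues of A^2 are the squares of the eigenvalues of A.
assert np.allclose(np.sort(np.linalg.eigvals(A @ A)), lam**2)

# The eigenvalues of A^{-1} are the reciprocals of the eigenvalues of A.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1 / lam))

# The eigenvalues of αI + A are λ_i + α.
alpha = 5.0
assert np.allclose(np.sort(np.linalg.eigvals(alpha * np.eye(2) + A)), lam + alpha)

print("all checks passed")
```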
