in Linear Algebra
11,241 views
48 votes
The value of the dot product of the eigenvectors corresponding to any pair of different eigenvalues of a $4$-by-$4$ symmetric positive definite matrix is ___________

4 Comments

I still don't get what $(Ax, y) - (x, Ay)$ means, though.

Here the parentheses "$(\,)$" denote an inner product of two vectors. $Ax$ and $y$ are both vectors; similarly, $x$ and $Ay$ are vectors, so the inner product is defined for them. Equivalently, you can say a dot product is taken between the two vectors (inner product is the more general term and can also be used for functions). This inner product is also commonly written with angle brackets, $\langle \cdot , \cdot \rangle$.

In the link given by srestha,

$(x,y)=\sum_i x_iy_i = x^Ty$ (matrix multiplication)

$(Ax,y) - (x,Ay) = (Ax)^Ty - x^TAy = (\lambda_1x)^Ty - x^T\lambda_2y = \lambda_1 x^Ty - \lambda_2 x^Ty = (\lambda_1 - \lambda_2)x^Ty = (\lambda_1 - \lambda_2)(x,y)$

Now, $(Ax,y) = (Ax)^Ty = x^TA^Ty = x^TAy = (x,Ay)$ (using $A^T = A$). So the left-hand side is zero, and therefore $(x,y) = 0$, because $\lambda_1 \neq \lambda_2$.
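As a quick numerical sanity check (my addition, not part of the original comment, using NumPy): for a symmetric $A$, $(Ax,y) = (x,Ay)$ holds for arbitrary vectors $x$ and $y$, not just eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random 4x4 symmetric matrix: B + B^T is always symmetric.
B = rng.standard_normal((4, 4))
A = B + B.T

x = rng.standard_normal(4)
y = rng.standard_normal(4)

lhs = (A @ x) @ y   # (Ax, y) = (Ax)^T y
rhs = x @ (A @ y)   # (x, Ay) = x^T (A y)
print(np.isclose(lhs, rhs))  # True for any symmetric A
```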

0

Note: A matrix is positive definite if it’s symmetric and all its eigenvalues are positive (Source).

2

For any symmetric matrix:

Eigenvectors corresponding to distinct eigenvalues are mutually orthogonal.

http://sites.science.oregonstate.edu/math/home/programs/undergrad/CalculusQuestStudyGuides/vcalc/eigen/eigen.html

Note: a positive definite matrix is a special type of symmetric matrix.

4

4 Answers

124 votes
Best answer

Let $\lambda_{1}$ and $\lambda_{2}$ be two distinct eigenvalues of matrix $A$ and $u$ and $v$ be their corresponding eigenvectors respectively.

We know that an eigenvector $X$ corresponding to an eigenvalue $\lambda$ of matrix $A$ satisfies 

$AX = \lambda X$

$\therefore Au = \lambda _{1}u \quad \to (1)$ and $Av = \lambda _{2}v \quad\to (2)$

On pre-multiplying eqn $(1)$ with $v^{T}$, we get

$v^{T}Au = v^{T}\lambda _{1}u$

$\left ( v^{T}A\right )u = v^{T}\lambda _{1}u$

$\left ( A^{T}v\right )^{T}u = v^{T}\lambda _{1}u$

 $\left ( Av\right )^{T}u = v^{T}\lambda _{1}u$  $($since $A$ is a symmetric matrix, we can write $A^{T} = A)$

But $Av = \lambda _{2}v$ ... from $(2)$

$\left ( \lambda _{2}v\right )^{T}u = v^{T}\lambda _{1}u$

$\lambda _{2}v^{T}u = v^{T}\lambda _{1}u$

$\lambda _{2}v^{T}u = \lambda _{1}v^{T}u$ $($as $\lambda _{1}$ is a constant, we can write $v^{T}\lambda _{1} = \lambda _{1}v^{T})$

$\lambda _{2}v^{T}u - \lambda _{1}v^{T}u = 0$

$\left ( \lambda _{2} - \lambda _{1} \right )v^{T}u = 0$

$\therefore$ Either $\left ( \lambda _{2} - \lambda _{1} \right ) = 0$ or $v^{T}u = 0$

But since $\lambda _{2} \neq \lambda _{1}$, $v^{T}u$ must be $0$.

$v^{T}u$ is nothing but the dot product of the eigenvectors $u$ and $v$.

Hence, we can conclude that the eigenvectors corresponding to distinct eigenvalues of a real symmetric matrix are orthogonal.
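The conclusion above is easy to verify numerically. A minimal NumPy sketch (my addition, not part of the answer): $M^TM + I$ is one standard way to construct a symmetric positive definite matrix, and `np.linalg.eigh` is NumPy's eigensolver for symmetric matrices.

```python
import numpy as np

rng = np.random.default_rng(1)

# M^T M is symmetric positive semidefinite; adding I makes it positive definite.
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)

# eigh is the eigensolver for symmetric (Hermitian) matrices;
# it returns eigenvalues in ascending order and eigenvectors as columns.
eigvals, eigvecs = np.linalg.eigh(A)

print(np.all(eigvals > 0))                          # True: positive definite
print(np.allclose(eigvecs.T @ eigvecs, np.eye(4)))  # True: columns are orthonormal
```

Note that `eigh` returns orthonormal eigenvectors even when eigenvalues repeat (it picks an orthogonal basis of each eigenspace); for distinct eigenvalues, orthogonality is forced, exactly as the proof shows.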


4 Comments

very very very good explanation. Thank you so much.
0

Thanks @. I have also found a similar proof.

Suppose $x$ and $y$ are two eigenvectors of $A$ with corresponding eigenvalues $\lambda_1$ and $\lambda_2$ respectively ($\lambda_1 \neq \lambda_2$): $$Ax  = \lambda_1 x\\ Ay  = \lambda_2 y$$

The $\lambda$'s are scalars,

$\therefore \lambda^T = \lambda$ and $\lambda^T x = \lambda x = x \lambda = x\lambda^T$ (pre- and post-multiplication are the same for a scalar).

$\color{green}{x^T \lambda_1y} = (\lambda_1 x)^Ty = (Ax)^Ty =   \underbrace{x^TA^Ty = x^T A y}_\text{symmetric matrix} = \color{blue}{x^T \lambda_2y}$ 

$\Rightarrow \quad \, \, \, \, \,\color{green}{x^T \lambda_1y}  = \color{blue}{x^T \lambda_2y}  \\  \Rightarrow \underbrace{\lambda_1x^Ty  = \lambda_2x^Ty}_\text{$\lambda$ is scalar we can write it anywhere }   \Rightarrow (\lambda_1- \lambda_2)x^Ty = 0$

Here $\lambda_1 - \lambda_2 \neq 0$ since $\lambda_1 \neq \lambda_2$, therefore $x^Ty = 0$, i.e. $x\perp y$. Hence proved.

This proof is important if you go for IIT/IISc interviews after GATE (interviews for Machine Learning or similar research programmes), but it is not important for GATE itself. :)

Reference – https://math.stackexchange.com/a/82471
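The step that uses $A^T = A$ is essential: for a non-symmetric matrix, eigenvectors of distinct eigenvalues need not be orthogonal. A small counterexample (my addition, not from the linked proof):

```python
import numpy as np

# Upper triangular, hence non-symmetric; eigenvalues are the diagonal entries.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
x, y = eigvecs[:, 0], eigvecs[:, 1]

print(eigvals)     # distinct eigenvalues (2 and 3)
print(abs(x @ y))  # about 0.707, not 0: the eigenvectors are not orthogonal
```

Here the eigenvectors are $(1,0)$ for $\lambda = 2$ and $(1,1)/\sqrt{2}$ for $\lambda = 3$, whose dot product is $1/\sqrt{2} \neq 0$.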

8

Page 332, Introduction to Linear Algebra, Gilbert Strang.

4
52 votes
The answer to this question is ZERO.

This is because eigenvectors corresponding to DIFFERENT eigenvalues of a REAL symmetric matrix are ORTHOGONAL to each other.

However, eigenvectors corresponding to the same eigenvalue need not be orthogonal.

And the dot product of orthogonal (perpendicular) vectors is $0$ (ZERO).

For more info see the link: http://math.stackexchange.com/questions/82467/eigenvectors-of-real-symmetric-matrices-are-orthogonal
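The same fact can also be checked exactly, not just in floating point. A symbolic sketch with SymPy (my addition; a $2\times 2$ symmetric matrix for brevity):

```python
from sympy import Matrix

A = Matrix([[2, 1],
            [1, 2]])  # symmetric; eigenvalues are 1 and 3

# eigenvects() returns (eigenvalue, multiplicity, [basis vectors]) triples.
(l1, _, [v1]), (l2, _, [v2]) = A.eigenvects()

print(sorted([l1, l2]))  # [1, 3]: distinct eigenvalues
print(v1.dot(v2))        # 0: exactly orthogonal
```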

2 Comments

Can someone please explain it in an easy way? I am not getting the explanation on StackExchange.

Thank you
5
Eigenvectors belonging to distinct eigenvalues of a symmetric positive definite matrix are orthogonal. This is because real symmetric matrices are Hermitian and therefore normal.
0
3 votes

1 comment

How has the normalisation taken place?
0
–5 votes
1

 

Eigenvectors are orthogonal.
Answer:
