1 vote
490 views
If $A$ is a skew-symmetric matrix of order $n$, then the number of linearly independent eigenvectors of $\left(A + A^{T}\right)$ is

A) 0

B) 1

C) n -1

D) n

______________________________________________________________________

I know it can be solved using

#independent eigenvectors = n - rank(matrix)

I want to know how the above formula is derived!
in Linear Algebra
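The formula can at least be checked numerically before deriving it. Below is a minimal sketch using NumPy (the diagonal matrix `M` is just an illustrative example, not from the question):

```python
import numpy as np

# Claim: for an eigenvalue lam of an n x n matrix M, the number of
# linearly independent eigenvectors (geometric multiplicity) equals
#   n - rank(M - lam*I).
M = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
lam = 2.0                 # eigenvalue with algebraic multiplicity 2
n = M.shape[0]
r = np.linalg.matrix_rank(M - lam * np.eye(n))
print(n - r)              # -> 2: two independent eigenvectors for lam = 2
```

Here `M - lam*I` is `diag(0, 0, 1)`, which has rank $1$, so the formula gives $3-1=2$, matching the two obvious eigenvectors $(1,0,0)$ and $(0,1,0)$.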
0
is the answer $0$?
0

given that $A$ is a skew-symmetric matrix,

so we know that $A^{T} = -A$

therefore, $(A+A^{T}) = (A - A) = 0$

In a skew-symmetric matrix the diagonal entries are all $0$'s,

so the eigenvalue is $\lambda = 0$
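The identity above can be checked numerically. A minimal sketch assuming NumPy, using the same $3 \times 3$ skew-symmetric matrix that appears in the answer below:

```python
import numpy as np

# Any skew-symmetric matrix A satisfies A.T == -A,
# so A + A.T is necessarily the zero matrix.
A = np.array([[ 0.0,  1.0, -2.0],
              [-1.0,  0.0,  3.0],
              [ 2.0, -3.0,  0.0]])
assert np.allclose(A.T, -A)   # skew-symmetry
print(A + A.T)                # prints the 3x3 zero matrix
```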

0
@magma you are right about the eigenvalue, but you still have to find the eigenvectors.

@Laxman as I mentioned, the answer is $(n - 0) = n$
0
so the correct answer is $n$?

1 Answer

5 votes

Given that $A$ is a skew-symmetric matrix of order $n$, $A^{T}$ is also of order $n.$

We know that for a skew-symmetric matrix, $A^{T}=-A$

                                                    $\Rightarrow A^{T}+A=[ 0 ]$

Let's take order $n=3$

$A=\begin{bmatrix} 0 &1 &-2 \\ -1 &0 &3 \\ 2 &-3 &0 \end{bmatrix}$

$A^{T}=\begin{bmatrix} 0 &-1 &2 \\ 1 &0 &-3 \\ -2 &3 &0 \end{bmatrix}$

$A+A^{T}=\begin{bmatrix} 0 &1 &-2 \\ -1 &0 &3 \\ 2 &-3 &0 \end{bmatrix}+\begin{bmatrix} 0 &-1 &2 \\ 1 &0 &-3 \\ -2 &3 &0 \end{bmatrix}$

$A+A^{T}=\begin{bmatrix} 0 &0 &0 \\ 0 &0 &0 \\ 0 &0 &0 \end{bmatrix}$

Characteristic equation for the matrix $A+A^{T}:$

$|(A+A^{T})-\lambda I|=0$       where $I=\begin{bmatrix} 1 &0 &0 \\ 0 &1 &0 \\ 0 &0 &1 \end{bmatrix}$  Identity matrix.

$\Rightarrow \begin{vmatrix} -\lambda & 0 &0 \\ 0 & -\lambda &0 \\ 0&0 & -\lambda \end{vmatrix}=0$   where $\lambda$ is the Eigen Value.

$\Rightarrow  -\lambda(\lambda^{2}-0)=0$

$\Rightarrow  -\lambda^{3}=0$

$\Rightarrow  \lambda^{3}=0$

So, the eigenvalues are $\lambda_{1}=0,\lambda_{2}=0,\lambda_{3}=0$

We know that $AX=\lambda X$

$\Rightarrow(AX-\lambda X)=[0]$

$\Rightarrow((A+A^{T})X-\lambda X)=[0]$     here the matrix in question is $A+A^{T}$ rather than $A$

$\Rightarrow((A+A^{T})-\lambda I)X=[0]$ ------->(1) 

Now,$(A+A^{T})-\lambda I=\begin{bmatrix} 0 &0 &0 \\ 0&0 &0 \\ 0 & 0 &0 \end{bmatrix}-0.\begin{bmatrix} 1 &0 &0 \\ 0&1 &0 \\ 0 & 0 &1 \end{bmatrix}$  where $\lambda=0$ and $I$ is the identity matrix.

$(A+A^{T})-\lambda I=\begin{bmatrix} 0 &0 &0 \\ 0&0 &0 \\ 0 & 0 &0 \end{bmatrix}-\begin{bmatrix} 0 &0 &0 \\ 0&0 &0 \\ 0 & 0 &0 \end{bmatrix}$

$(A+A^{T})-\lambda I=\begin{bmatrix} 0 &0 &0 \\ 0&0 &0 \\ 0 & 0 &0 \end{bmatrix}$

Now from the equation $(1),$

$((A+A^{T})-\lambda I)X=[ 0 ]$

$\Rightarrow\begin{bmatrix} 0 &0 &0 \\ 0&0 &0 \\ 0 & 0 &0 \end{bmatrix}.\begin{bmatrix} x\\y \\ z \end{bmatrix}=\begin{bmatrix} 0\\0 \\ 0 \end{bmatrix}$  ,where $X=\begin{bmatrix} x\\y \\ z \end{bmatrix}$ is the eigenvector we are solving for.

   By the rank–nullity theorem: Nullity $=$ Order (dimension of the matrix) $-$ Rank

Rank of the matrix $A+A^{T}$ is $0$ and the number of unknowns (variables) is $3$.

So$,r=0,n=3$

Clearly $r<n$, which is the condition for infinitely many solutions.

We can assign arbitrary values to the $n-r=3-0=3$ free variables, giving $3$ linearly independent solutions.

let say $x=k_{1},y=k_{2},z=k_{3}$

So, the number of linearly independent eigenvectors is $3$; for a general order $n$, it is $n$.
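The whole computation above can be reproduced in a few lines. A sketch assuming NumPy, with the same matrix $A$ used in the answer:

```python
import numpy as np

A = np.array([[ 0.0,  1.0, -2.0],
              [-1.0,  0.0,  3.0],
              [ 2.0, -3.0,  0.0]])
S = A + A.T                    # the zero matrix, since A is skew-symmetric
lam = 0.0                      # the only eigenvalue of S
n = S.shape[0]
r = np.linalg.matrix_rank(S - lam * np.eye(n))   # rank 0
print(n - r)                   # -> 3 linearly independent eigenvectors
```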


0
how does that answer the question?
0

Lakshman Patel RJIT

you're right, good job

here they want only the number of linearly independent eigenvectors; they didn't ask for the number of solutions

$n-r = 3-0 = 3$ linearly independent eigenvectors, from which we get infinitely many solutions

0

@Laxman great effort !!

You are answering the question directly. I knew that we can do this.

My question (or doubt) is: how is the statement below proved?

#independent eigenvectors = n - rank(matrix)

Whatever you have done is completely correct, but I wanted a proof of the above statement.

0
For a proof, you can look in any linear algebra textbook. I know a procedure; if I find it somewhere, I can tell you.
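A short sketch of the proof via the rank–nullity theorem (standard linear algebra, written here in LaTeX):

```latex
% The eigenvectors of M for eigenvalue \lambda are exactly the nonzero
% solutions of (M - \lambda I)X = 0, i.e. the null space of M - \lambda I.
% For an n x n matrix, the rank--nullity theorem states
\operatorname{rank}(M-\lambda I) + \dim \operatorname{null}(M-\lambda I) = n
% so the number of linearly independent eigenvectors for \lambda is
\dim \operatorname{null}(M-\lambda I) = n - \operatorname{rank}(M-\lambda I)
```

Applied to this question with $M = A+A^{T}$ and $\lambda = 0$, the rank is $0$, so the count is $n - 0 = n$.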
