Let $A$ and $B$ be two invertible real matrices of order $n$. Show that $\det(x A+(1-x) B)=0$ has finitely many solutions for $x.$

3 Comments

$\det (xA + (1-x)B)$ is nothing but a real polynomial in $x$ of degree at most $n.$ It is not the zero polynomial, because at $x=1$ its value is $\det(A) \neq 0$ (and at $x=0$ it is $\det(B) \neq 0$), since $A$ and $B$ are invertible. So, $\det (xA + (1-x)B) = 0$ has at most $n$ roots, i.e., finitely many solutions for $x.$

$xA + (1-x)B$ is nothing but a linear combination of the matrices $A$ and $B.$
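To see this concretely, here is a minimal sketch (assuming SymPy is available; $A$ and $B$ below are arbitrary invertible $2 \times 2$ matrices chosen only for illustration, not taken from the question):

```python
import sympy as sp

# Two arbitrary invertible 2x2 real matrices (illustration only).
A = sp.Matrix([[2, 1],
               [1, 1]])   # det(A) = 1, so A is invertible
B = sp.Matrix([[1, 0],
               [3, 2]])   # det(B) = 2, so B is invertible

x = sp.symbols('x')

# p(x) = det(x*A + (1-x)*B) is a polynomial in x of degree at most n = 2.
p = sp.expand((x * A + (1 - x) * B).det())
print(p)                        # x**2 - 2*x + 2
print(sp.Poly(p, x).degree())   # 2, i.e. at most n

# p(1) = det(A) != 0, so p is not the zero polynomial;
# a nonzero polynomial of degree <= n has at most n roots.
print(p.subs(x, 1))             # 1 = det(A)
print(sp.solve(sp.Eq(p, 0), x)) # finitely many roots (here: 1 - I, 1 + I)
```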

@ankitgupta.1729 Sir, if matrix $A$ is a linear combination of two other matrices $B, C$, and the columns of $B$ and $C$ are linearly independent, does it follow that the columns of $A$ are linearly dependent?

Say, matrix $B= \begin{bmatrix} 0 &1 \\ 1 &0 \end{bmatrix}$ and matrix $C= \begin{bmatrix} 0 &1 \\ -1 &0 \end{bmatrix}$

Now, consider a matrix $A$ which is a linear combination of matrices $B$ and $C$ as:

$A = \frac{1}{3} \begin{bmatrix} 0 &1 \\ 1 &0 \end{bmatrix} + \frac{2}{3} \begin{bmatrix} 0 &1 \\ -1 &0 \end{bmatrix}$

$A= \begin{bmatrix} 0 &1 \\ \frac{-1}{3} &0 \end{bmatrix}$

Now, you can see that the set of columns of $A$, $\left \{ \begin{bmatrix} 0\\ \frac{-1}{3} \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right \}$, is linearly independent, so the columns of $A$ need not be linearly dependent.
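A quick numerical check of this counterexample (a minimal sketch, assuming NumPy is available):

```python
import numpy as np

B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
C = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# A is the linear combination (1/3)*B + (2/3)*C from the comment above.
A = (1 / 3) * B + (2 / 3) * C
print(A)  # [[ 0.      1.    ]
          #  [-0.3333  0.    ]]

# The columns of A are linearly independent iff A has full rank
# (equivalently, det(A) != 0).
print(np.linalg.matrix_rank(A))  # 2 -> columns are linearly independent
print(np.linalg.det(A))          # 0.333... != 0
```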