Let $A$ and $B$ be two invertible real matrices of order $n$. Show that $\det(x A+(1-x) B)=0$ has finitely many solutions for $x.$

$\det (xA + (1-x)B)$ is a real polynomial in $x$ of degree at most $n.$ Moreover, it is not the zero polynomial: at $x = 1$ its value is $\det(A) \neq 0,$ since $A$ is invertible. A nonzero polynomial of degree at most $n$ has at most $n$ roots, so $\det (xA + (1-x)B) = 0$ has finitely many solutions for $x.$
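As a sanity check, here is a small SymPy sketch for one concrete pair of invertible $2 \times 2$ matrices (the matrices below are my own illustrative choice, not from the question): the determinant expands to a polynomial in $x$ of degree at most $2$, with finitely many roots.

```python
# Illustrative check with SymPy: det(x*A + (1-x)*B) is a polynomial
# in x of degree at most n, hence has finitely many roots.
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, 0], [0, 3]])   # invertible: det(A) = 6
B = sp.Matrix([[1, 1], [0, 1]])   # invertible: det(B) = 1

p = sp.expand((x*A + (1 - x)*B).det())
print(p)                          # 2*x**2 + 3*x + 1, degree 2 <= n = 2
print(sp.solve(sp.Eq(p, 0), x))  # [-1, -1/2]: finitely many solutions
```

Here $xA + (1-x)B = \begin{bmatrix} x+1 & 1-x \\ 0 & 2x+1 \end{bmatrix},$ so the determinant is $(x+1)(2x+1),$ which vanishes only at $x = -1$ and $x = -\frac{1}{2}.$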

$xA + (1-x)B$ is simply a linear combination of the matrices $A$ and $B.$

@ankitgupta.1729 Sir, if a matrix $A$ is a linear combination of two other matrices $B$ and $C,$ and the columns of $B$ and $C$ are linearly independent, does it follow that the columns of $A$ are linearly dependent?

Say, matrix $B= \begin{bmatrix} 0 &1 \\ 1 &0 \end{bmatrix}$ and matrix $C= \begin{bmatrix} 0 &1 \\ -1 &0 \end{bmatrix}$

No. For a counterexample, consider a matrix $A$ which is a linear combination of the matrices $B$ and $C$ above:

$A = \frac{1}{3} \begin{bmatrix} 0 &1 \\ 1 &0 \end{bmatrix} + \frac{2}{3} \begin{bmatrix} 0 &1 \\ -1 &0 \end{bmatrix}$

$A= \begin{bmatrix} 0 &1 \\ \frac{-1}{3} &0 \end{bmatrix}$

Now, you can see that the set of columns of $A,$ namely $\left \{ \begin{bmatrix} 0\\ \frac{-1}{3} \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right \},$ is linearly independent. So the columns of $A$ need not be linearly dependent.
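The independence of those two columns can be confirmed numerically, e.g. with NumPy: a square matrix has linearly independent columns exactly when it has full rank (equivalently, nonzero determinant).

```python
# Numerical check that the columns of A from the counterexample
# above are linearly independent.
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0/3.0, 0.0]])

# Full rank (rank 2) means the two columns are linearly independent.
print(np.linalg.matrix_rank(A))  # 2
print(np.linalg.det(A))          # 1/3, nonzero
```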