How to prove that eigenvectors from different eigenvalues are linearly independent

I'll do it with two vectors. I'll leave it to you to do it in general.

Suppose $\mathbf{v}_1$ and $\mathbf{v}_2$ correspond to distinct eigenvalues $\lambda_1$ and $\lambda_2$, respectively.

Take a linear combination that is equal to $0$, $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2 = \mathbf{0}$. We need to show that $\alpha_1=\alpha_2=0$.

Applying $T$ to both sides, we get $$\mathbf{0} = T(\mathbf{0}) = T(\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2) = \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2.$$ Now, instead, multiply the original equation by $\lambda_1$: $$\mathbf{0} = \lambda_1\alpha_1\mathbf{v}_1 + \lambda_1\alpha_2\mathbf{v}_2.$$ Now take the two equations, $$\begin{align*} \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2\\ \mathbf{0} &= \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_1\mathbf{v}_2 \end{align*}$$ and subtract the second from the first to get $$\mathbf{0} = 0\mathbf{v}_1 + \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2 = \alpha_2(\lambda_2-\lambda_1)\mathbf{v}_2.$$

Since $\lambda_2-\lambda_1\neq 0$, and since $\mathbf{v}_2\neq\mathbf{0}$ (because $\mathbf{v}_2$ is an eigenvector), then $\alpha_2=0$. Using this on the original linear combination $\mathbf{0} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2$, we conclude that $\alpha_1=0$ as well (since $\mathbf{v}_1\neq\mathbf{0}$).

So $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent.
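
For concreteness, here is the same elimination run on a small example (numbers of my choosing): take $T$ given by the matrix $A=\begin{pmatrix}1&2\\0&3\end{pmatrix}$, which has $\lambda_1=1$ with eigenvector $\mathbf{v}_1=(1,0)^T$ and $\lambda_2=3$ with eigenvector $\mathbf{v}_2=(1,1)^T$. If $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2=\mathbf{0}$, then applying $A$ gives $\alpha_1\mathbf{v}_1+3\alpha_2\mathbf{v}_2=\mathbf{0}$, while multiplying the original equation by $\lambda_1=1$ just gives it back; subtracting, $2\alpha_2\mathbf{v}_2=\mathbf{0}$, so $\alpha_2=0$, and then $\alpha_1\mathbf{v}_1=\mathbf{0}$ forces $\alpha_1=0$.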

Now try using induction on $n$ for the general case.
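
Not a proof, but if you want to sanity-check the general statement numerically before writing out the induction, here is a small NumPy sketch (the size $n$ and the eigenvalues $1,\dots,n$ are arbitrary choices of mine): it builds a matrix with $n$ distinct eigenvalues and verifies that the matrix whose columns are its eigenvectors has full rank.

```python
import numpy as np

# Numerical sanity check (not a proof): a matrix with n distinct eigenvalues
# has n eigenvectors, and the matrix whose columns are those eigenvectors
# should have rank n, i.e. the eigenvectors are linearly independent.
rng = np.random.default_rng(0)
n = 5

# Build A = P D P^{-1} with prescribed distinct eigenvalues 1, 2, ..., n.
D = np.diag(np.arange(1.0, n + 1))
P = rng.standard_normal((n, n))      # almost surely invertible
A = P @ D @ np.linalg.inv(P)

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, V = np.linalg.eig(A)
print("eigenvalues:", np.round(np.sort(eigenvalues.real), 6))
print("rank of eigenvector matrix:", np.linalg.matrix_rank(V))   # expect n
```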


Alternative:

Suppose, for contradiction, that eigenvectors $v_1,\dots,v_n$ with distinct eigenvalues $\lambda_1,\dots,\lambda_n$ are linearly dependent. Since $v_1\neq 0$, we may let $j$ be the largest index such that $v_1,\dots,v_j$ are independent; then $j<n$, and $v_{j+1}$ lies in the span of $v_1,\dots,v_j$ (otherwise $v_1,\dots,v_{j+1}$ would be independent, contradicting maximality). So there exist scalars $c_i$, $1\leq i\leq j$, such that $\sum_{i=1}^j c_iv_i=v_{j+1}$. But by applying $T$ we also have that

$$\sum_{i=1}^j c_i\lambda_iv_i=\lambda_{j+1}v_{j+1}=\lambda_{j+1}\sum_{i=1}^j c_i v_i.$$ Hence $$\sum_{i=1}^j \left(\lambda_i-\lambda_{j+1}\right) c_iv_i=0.$$ Since $v_1,\dots,v_j$ are independent, every coefficient $(\lambda_i-\lambda_{j+1})c_i$ must vanish, and since $\lambda_i\neq \lambda_{j+1}$ for $1\leq i\leq j$, this forces $c_i=0$ for all $i$. But then $v_{j+1}=\sum_{i=1}^j c_iv_i=0$, which is a contradiction because eigenvectors are nonzero.
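
For instance, in the smallest case $j=1$: if $c_1v_1=v_2$, then applying $T$ gives $$c_1\lambda_1v_1=\lambda_2v_2=\lambda_2c_1v_1 \quad\Longrightarrow\quad (\lambda_1-\lambda_2)c_1v_1=0;$$ since $\lambda_1\neq\lambda_2$ and $v_1\neq 0$, this gives $c_1=0$ and hence $v_2=0$, which is impossible for an eigenvector.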

Hope that helps,


Hey I think there's a slick way to do this without induction. Suppose that $T$ is a linear transformation of a vector space $V$ and that $v_1,\ldots,v_n \in V$ are eigenvectors of $T$ with corresponding eigenvalues $\lambda_1,\ldots,\lambda_n \in F$ ($F$ the field of scalars). We want to show that, if $\sum_{i=1}^n c_i v_i = 0$, where the coefficients $c_i$ are in $F$, then necessarily each $c_i$ is zero.

For simplicity, I will just explain why $c_1 = 0$. Consider the polynomial $p_1(x) \in F[x]$ given by $p_1(x) = (x-\lambda_2) \cdots (x-\lambda_n)$. Note that the factor $x-\lambda_1$ is "missing" here. Now, since each $v_i$ is an eigenvector of $T$, we have $$p_1(T) v_i = p_1(\lambda_i) v_i, \qquad \text{where} \qquad p_1(\lambda_i) = \begin{cases} 0 & \text{ if } i \neq 1, \\ \prod_{k=2}^n (\lambda_1-\lambda_k) \neq 0 & \text{ if } i = 1, \end{cases}$$ the product being nonzero because the eigenvalues are distinct.

Thus, applying $p_1(T)$ to the sum $\sum_{i=1}^n c_i v_i = 0$, we get $$ p_1(\lambda_1) c_1 v_1 = 0 $$ which implies $c_1 = 0$, since $p_1(\lambda_1) \neq 0$ and $v_1 \neq 0$.
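
For example, with $n=3$ we have $p_1(x)=(x-\lambda_2)(x-\lambda_3)$, and applying $p_1(T)$ to $c_1v_1+c_2v_2+c_3v_3=0$ kills the $v_2$ and $v_3$ terms outright, leaving $$(\lambda_1-\lambda_2)(\lambda_1-\lambda_3)\,c_1v_1=0,$$ so $c_1=0$; the same trick with the factors permuted gives $c_2=0$ and $c_3=0$. (For $n=2$ this is essentially the subtraction argument in the first answer above.)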