How to prove that eigenvectors from different eigenvalues are linearly independent

How can I prove that if I have $n$ eigenvectors corresponding to $n$ distinct eigenvalues, they are all linearly independent?

I’ll do it with two vectors. I’ll leave it to you to do it in general.

Suppose $\mathbf{v}_1$ and $\mathbf{v}_2$ are eigenvectors of a linear operator $T$ corresponding to distinct eigenvalues $\lambda_1$ and $\lambda_2$, respectively.

Take a linear combination that is equal to $0$, $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2 = \mathbf{0}$. We need to show that $\alpha_1=\alpha_2=0$.

Applying $T$ to both sides, we get
$$\mathbf{0} = T(\mathbf{0}) = T(\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2) = \alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2.$$

Now, instead, multiply the original equation by $\lambda_1$:
$$\alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_1\mathbf{v}_2 = \mathbf{0}.$$

Now take the two equations,
$$\begin{aligned}
\alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_2\mathbf{v}_2 &= \mathbf{0}\\
\alpha_1\lambda_1\mathbf{v}_1 + \alpha_2\lambda_1\mathbf{v}_2 &= \mathbf{0}
\end{aligned}$$

and taking the difference, we get:
$$(\lambda_2-\lambda_1)\alpha_2\mathbf{v}_2 = \mathbf{0}.$$

Since $\lambda_2-\lambda_1\neq 0$, and since $\mathbf{v}_2\neq\mathbf{0}$ (because $\mathbf{v}_2$ is an eigenvector), then $\alpha_2=0$. Using this on the original linear combination $\mathbf{0} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2$, we conclude that $\alpha_1=0$ as well (since $\mathbf{v}_1\neq\mathbf{0}$).

So $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent.
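As a numerical sanity check (not a proof), here is a sketch using NumPy with an arbitrarily chosen $2\times 2$ matrix whose eigenvalues $2$ and $3$ are distinct; the two eigenvectors are independent exactly when the matrix whose columns they form has rank $2$, i.e. when $\alpha_1\mathbf{v}_1+\alpha_2\mathbf{v}_2=\mathbf{0}$ forces $\alpha_1=\alpha_2=0$.

```python
import numpy as np

# An example matrix with distinct eigenvalues 2 and 3
# (triangular, so its eigenvalues are its diagonal entries).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are v1, v2

# The eigenvalues really are distinct...
assert len(set(np.round(eigvals, 8))) == 2

# ...and the eigenvector matrix [v1 v2] has full rank,
# so v1 and v2 are linearly independent.
print(np.linalg.matrix_rank(eigvecs))  # → 2
```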

Now try using induction on $n$ for the general case.
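The general statement can also be checked numerically. A hedged sketch: the matrix below is an arbitrary upper-triangular example with diagonal entries $1,\dots,n$, so it has $n$ distinct eigenvalues, and its $n$ eigenvectors should form a rank-$n$ (hence invertible) matrix.

```python
import numpy as np

n = 5
# Upper triangular matrix of ones, with 0, 1, ..., n-1 added on the
# diagonal, giving diagonal entries 1, 2, ..., n.  A triangular matrix's
# eigenvalues are its diagonal entries, so they are distinct here.
A = np.triu(np.ones((n, n))) + np.diag(np.arange(n, dtype=float))

eigvals, eigvecs = np.linalg.eig(A)

# n eigenvectors for n distinct eigenvalues are linearly independent,
# i.e. the matrix with those eigenvectors as columns has rank n.
print(np.linalg.matrix_rank(eigvecs))  # → 5
```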