# Matrices commute if and only if they share a common basis of eigenvectors?

I’ve come across a paper that mentions the fact that matrices commute if and only if they share a common basis of eigenvectors. Where can I find a proof of this statement?

Suppose that $A$ and $B$ are commuting $n\times n$ matrices, say with complex entries.
Then we decompose $\mathbb C^n$ as a direct sum of eigenspaces of $A$, say
$\mathbb C^n = E_{\lambda_1} \oplus \cdots \oplus E_{\lambda_m}$, where $\lambda_1,\ldots, \lambda_m$ are the eigenvalues of $A$, and $E_{\lambda_i}$ is the eigenspace for $\lambda_i$.
(Here $m \leq n$, but some eigenspaces could be of dimension bigger than one, so we need not have $m = n$.)

Now one sees that since $B$ commutes with $A$, $B$ preserves each of the $E_{\lambda_i}$:
if $Av = \lambda_i v$, then $A(Bv) = (AB)v = (BA)v = B(Av) = B(\lambda_i v) = \lambda_i (Bv)$, so $Bv$ again lies in $E_{\lambda_i}$.
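This invariance is easy to check numerically. Here is a minimal NumPy sketch with a hypothetical commuting pair (chosen by hand so that $AB = BA$, with $A$ having a repeated eigenvalue $2$):

```python
import numpy as np

# Hypothetical example: A = diag(2, 2, 5), and B is block-diagonal with the
# same block structure, so that A and B commute.
A = np.diag([2., 2., 5.])
B = np.array([[1., 2., 0.],
              [3., 4., 0.],
              [0., 0., 7.]])
assert np.allclose(A @ B, B @ A)   # the matrices commute

# v is an eigenvector of A with eigenvalue 2 ...
v = np.array([1., 0., 0.])
assert np.allclose(A @ v, 2 * v)

# ... and B v stays inside the same eigenspace: A (B v) = 2 (B v).
Bv = B @ v
assert np.allclose(A @ Bv, 2 * Bv)
```

Note that $Bv$ need not be a multiple of $v$ (here $Bv = (1,3,0)^T$); the point is only that it stays inside $E_{\lambda_i}$.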

Now we consider $B$ restricted to each $E_{\lambda_i}$ separately, and decompose
each $E_{\lambda_i}$ into a direct sum of eigenspaces for $B$. (If $B$ is diagonalizable, then so is its restriction to the invariant subspace $E_{\lambda_i}$, since the minimal polynomial of the restriction divides that of $B$.) Putting all these decompositions together, we get a decomposition of $\mathbb C^n$ into a direct sum of subspaces, each of which is a simultaneous eigenspace for $A$ and $B$; choosing a basis of each summand gives a common basis of eigenvectors. The converse direction is immediate: if $A$ and $B$ are both diagonal with respect to a common basis, they commute, because diagonal matrices commute.
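The two-step construction above can be sketched numerically with NumPy, under the assumption that both matrices are diagonalizable. The matrices here are a hypothetical example (a repeated eigenvalue of $A$, with $B$ mixing vectors inside that eigenspace), not anything from the paper:

```python
import numpy as np

# Hypothetical commuting, diagonalizable pair: A has the repeated eigenvalue 2,
# and B acts nontrivially inside that eigenspace.
A = np.diag([2., 2., 5.])
B = np.array([[0., 1., 0.],
              [1., 0., 0.],
              [0., 0., 7.]])
assert np.allclose(A @ B, B @ A)

# Step 1: group the eigenvectors of A by eigenvalue.
lam, V = np.linalg.eig(A)
columns = []
for mu in np.unique(np.round(lam, 8)):
    Vi = V[:, np.isclose(lam, mu)]      # basis of the eigenspace E_mu of A
    # Step 2: B preserves E_mu, i.e. B @ Vi = Vi @ M where M is the matrix
    # of B restricted to E_mu; diagonalize that restriction.
    M = np.linalg.pinv(Vi) @ B @ Vi
    _, W = np.linalg.eig(M)
    columns.append(Vi @ W)              # simultaneous eigenvectors of A and B

P = np.hstack(columns)                  # common basis of eigenvectors
# Both inv(P) @ A @ P and inv(P) @ B @ P come out diagonal.
```

The rounding in `np.unique(np.round(lam, 8))` is just a crude way to group numerically equal eigenvalues; a production implementation would need a more careful clustering tolerance.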

NB: I am cheating here, in that $A$ and $B$ may not be diagonalizable, and then the statement of your question is not literally true: for instance, the nilpotent matrix $N = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ commutes with itself, yet its only eigenvectors are the multiples of $e_1$, so it admits no basis of eigenvectors at all. But in this case, if you replace “eigenspace” by “generalized eigenspace”, the above argument goes through just as well.