I’ve come across a paper that mentions the fact that matrices commute if and only if they share a common basis of eigenvectors. Where can I find a proof of this statement?
Suppose that $A$ and $B$ are $n \times n$ matrices, with complex entries say, that commute.
Then we decompose $\mathbb{C}^n$ as a direct sum of eigenspaces of $A$, say
$$\mathbb{C}^n = E_{\lambda_1} \oplus \cdots \oplus E_{\lambda_m},$$
where $\lambda_1, \ldots, \lambda_m$ are the eigenvalues of $A$, and $E_{\lambda_i}$ is the eigenspace for $\lambda_i$.
(Here $m \leq n$, but some eigenspaces could have dimension greater than one, so we need not have $m = n$.)
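To make the decomposition concrete, here is a small numpy sketch with a hypothetical matrix: a symmetric $3 \times 3$ matrix $A$ built to have the repeated eigenvalue 2, so there are $m = 2$ distinct eigenvalues but $n = 3$ (the matrix, the basis `P`, and the chosen eigenvalues are all illustrative, not from the argument above).

```python
import numpy as np

# Hypothetical example: a symmetric A with eigenvalues 2, 2, 5, so that
# E_2 has dimension two and m = 2 < n = 3.
rng = np.random.default_rng(0)
P, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthonormal basis
A = P @ np.diag([2.0, 2.0, 5.0]) @ P.T

# Decompose C^3 (here R^3) into eigenspaces of A.
vals, vecs = np.linalg.eigh(A)
dims = {}
for lam in np.unique(np.round(vals, 8)):
    # Columns of vecs belonging to the same eigenvalue span E_lambda.
    dims[float(lam)] = int(np.count_nonzero(np.isclose(vals, lam)))
print(dims)  # each distinct eigenvalue with the dimension of its eigenspace
```

The dimensions add up to $n$, which is exactly the direct-sum statement.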
Now one sees that, since $B$ commutes with $A$, $B$ preserves each of the $E_{\lambda_i}$:
if $Av = \lambda_i v$, then
$$A(Bv) = (AB)v = (BA)v = B(Av) = B(\lambda_i v) = \lambda_i (Bv).$$
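This invariance can be checked numerically. The sketch below uses a hypothetical commuting pair (both matrices are diagonalized in the same orthonormal basis `P`, which guarantees $AB = BA$); the vector $v$ lies in the eigenspace $E_2$ of $A$, and $Bv$, while not a multiple of $v$, lands back in $E_2$:

```python
import numpy as np

# Hypothetical commuting pair: A and B are diagonal in the same basis P.
rng = np.random.default_rng(0)
P, _ = np.linalg.qr(rng.normal(size=(4, 4)))
A = P @ np.diag([2.0, 2.0, 5.0, 5.0]) @ P.T
B = P @ np.diag([1.0, 3.0, 3.0, 7.0]) @ P.T
assert np.allclose(A @ B, B @ A)  # they commute

v = P[:, 0] + P[:, 1]             # a vector in the eigenspace E_2 of A
assert np.allclose(A @ v, 2 * v)

w = B @ v                         # B moves v within E_2 (w is not parallel to v)
assert np.allclose(A @ w, 2 * w)  # yet A w = 2 w, so w is still in E_2
```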
Now we consider $B$ restricted to each $E_{\lambda_i}$ separately, and decompose
each $E_{\lambda_i}$ into a direct sum of eigenspaces for $B$. (Since $B$ is diagonalizable on $\mathbb{C}^n$, its restriction to the invariant subspace $E_{\lambda_i}$ is again diagonalizable, so this decomposition exists.) Putting all these decompositions together, we get a decomposition of $\mathbb{C}^n$ into a direct sum of subspaces, each of which is a simultaneous eigenspace for $A$ and $B$.
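The two-step procedure can be carried out numerically. A sketch, again with a hypothetical commuting pair (symmetric matrices are used purely to keep the numerics robust; the argument itself needs no symmetry): first diagonalize $A$, then diagonalize $B$ restricted to each eigenspace of $A$, and assemble the common eigenbasis $S$.

```python
import numpy as np

# Hypothetical commuting pair, diagonal in the same orthonormal basis.
rng = np.random.default_rng(0)
P, _ = np.linalg.qr(rng.normal(size=(4, 4)))
A = P @ np.diag([2.0, 2.0, 5.0, 5.0]) @ P.T
B = P @ np.diag([1.0, 3.0, 3.0, 7.0]) @ P.T
assert np.allclose(A @ B, B @ A)

# Step 1: decompose the space into eigenspaces of A.
vals, vecs = np.linalg.eigh(A)
blocks = []
for lam in np.unique(np.round(vals, 8)):
    Q = vecs[:, np.isclose(vals, lam)]  # orthonormal basis of E_lambda
    # Step 2: diagonalize B restricted to E_lambda, in Q-coordinates.
    M = Q.T @ B @ Q
    _, U = np.linalg.eigh(M)
    blocks.append(Q @ U)                # simultaneous eigenvectors

S = np.hstack(blocks)                   # common eigenbasis for A and B
DA = S.T @ A @ S
DB = S.T @ B @ S
assert np.allclose(DA, np.diag(np.diag(DA)))  # A is diagonal in this basis
assert np.allclose(DB, np.diag(np.diag(DB)))  # and so is B
```

The columns of `S` are simultaneous eigenvectors, so conjugating by `S` diagonalizes both matrices at once.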
NB: I am cheating here, in that $A$ and $B$ may not be diagonalizable (and then the statement of your question is not literally true); but in this case, if you replace "eigenspace" by "generalized eigenspace", the above argument goes through just as well.