First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an eigenvalue of $BA$?

If it’s not true, then under what conditions is it true or not true?

If it is true, can anyone point me to a citation? I couldn’t find it in a quick perusal of Horn & Johnson. I have seen a couple of proofs that the characteristic polynomial of $AB$ equals the characteristic polynomial of $BA$, but none with any citations.

A trivial proof would be OK, but a citation is better.

**Answer**

If $v$ is an eigenvector of $AB$ for some nonzero $\lambda$, then $Bv\ne0$ (otherwise $ABv=\lambda v$ would vanish), and $$\lambda Bv=B(ABv)=(BA)Bv,$$ so $Bv$ is an eigenvector of $BA$ with the same eigenvalue. If $0$ is an eigenvalue of $AB$, then $0=\det(AB)=\det(A)\det(B)=\det(BA)$, so $0$ is also an eigenvalue of $BA$.
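This is easy to check numerically. Here is a minimal sketch (the matrices and the seed are arbitrary choices, not from the original post) that deliberately makes $A$ singular and verifies that $AB$ and $BA$ still have the same spectrum:

```python
import numpy as np

# Arbitrary example: A is made singular so that 0 is an eigenvalue of AB.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A[:, 0] = 0.0  # zero out a column => det(A) = 0
B = rng.standard_normal((4, 4))

# Eigenvalues of AB and BA, sorted so the two multisets can be compared.
eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

print(np.allclose(eig_AB, eig_BA))  # the two spectra agree up to round-off
```

Note that for non-square factors ($A$ being $m \times n$ and $B$ being $n \times m$), the nonzero eigenvalues still coincide, but the larger product picks up extra zero eigenvalues.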

More generally, Jacobson’s lemma in operator theory states that for any two bounded operators $A$ and $B$ acting on a Hilbert space $H$ (or more generally, for any two elements of a Banach algebra), the non-zero points of the spectrum of $AB$ coincide with those of the spectrum of $BA$.
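A one-line reason behind Jacobson’s lemma (this identity is a standard textbook step, not part of the original answer) is that for $\lambda\ne0$, invertibility of $\lambda I-AB$ forces invertibility of $\lambda I-BA$ via the explicit formula
$$(\lambda I-BA)^{-1}=\lambda^{-1}\bigl(I+B(\lambda I-AB)^{-1}A\bigr),$$
which one can verify by multiplying both sides by $\lambda I-BA$ and expanding. Since the formula only uses algebraic operations, it works in any Banach algebra, so the nonzero spectra of $AB$ and $BA$ agree.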

**Attribution:** *Source: Link, Question Author: dantswain, Answer Author: Bob Pego*