# Is a matrix multiplied with its transpose something special?

In my math lectures, we talked about the Gram determinant, where a matrix is multiplied by its transpose.

Is $A A^\mathrm T$ something special for any matrix $A$?

The main thing is presumably that $AA^T$ is symmetric. Indeed $$(AA^T)^T=(A^T)^TA^T=AA^T.$$ For symmetric matrices one has the Spectral Theorem, which says that there is a basis of eigenvectors and every eigenvalue is real.
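A quick numerical check of both claims (a sketch using NumPy; the matrix `A` is an arbitrary example):

```python
import numpy as np

# An arbitrary non-square example matrix.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

G = A @ A.T  # 2x2

# G equals its own transpose, i.e. it is symmetric.
assert np.allclose(G, G.T)

# By the Spectral Theorem its eigenvalues are real; eigvalsh
# exploits the symmetry and returns them as real numbers.
eigenvalues = np.linalg.eigvalsh(G)
print(eigenvalues)
```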

Moreover, if $A$ is invertible, then $AA^T$ is also positive definite, since $$x^TAA^Tx=(A^Tx)^T(A^Tx)>0$$ for all $x\neq 0$ (invertibility of $A$ guarantees $A^Tx\neq 0$).
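This can be verified numerically (a sketch; a random square matrix is almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # almost surely invertible

G = A @ A.T

# For any nonzero x: x^T A A^T x = ||A^T x||^2 > 0.
x = rng.standard_normal(4)
quadratic_form = x @ G @ x
assert quadratic_form > 0
assert np.isclose(quadratic_form, np.linalg.norm(A.T @ x) ** 2)
```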

Then we have: a matrix is positive definite if and only if it is the Gram matrix of a linearly independent set of vectors.
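For illustration, the Gram matrix of a linearly independent set of vectors passes a Cholesky factorization, which succeeds exactly for positive definite matrices (a sketch; the vectors are arbitrary examples):

```python
import numpy as np

# Rows of V: three linearly independent vectors in R^3.
V = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])

gram = V @ V.T  # Gram matrix of the rows: entries <v_i, v_j>

# Cholesky succeeds iff the matrix is symmetric positive definite.
L = np.linalg.cholesky(gram)
assert np.allclose(L @ L.T, gram)
```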

Last but not least, if one is interested in how much the linear map represented by $A$ changes the norm of a vector, one can compute

$$\sqrt{\left\langle Ax,Ax\right\rangle}=\sqrt{\left\langle A^TAx,x\right\rangle},$$

which simplifies for eigenvectors $x$ of $A^TA$ with eigenvalue $\lambda$ to

$$\sqrt{\left\langle Ax,Ax\right\rangle}=\sqrt{\lambda}\sqrt{\left\langle x,x\right\rangle}.$$

The determinant of $A^TA$ (the Gram determinant) is just the product of these eigenvalues.
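These last two facts can also be checked numerically (a sketch with NumPy; `A` is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

AtA = A.T @ A
# eigh returns eigenvalues and orthonormal eigenvectors (as columns).
eigenvalues, eigenvectors = np.linalg.eigh(AtA)

# For an eigenvector x of A^T A with eigenvalue lambda:
# ||Ax|| = sqrt(lambda) * ||x||.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.isclose(np.linalg.norm(A @ x),
                      np.sqrt(lam) * np.linalg.norm(x))

# det(A^T A) is the product of the eigenvalues.
assert np.isclose(np.linalg.det(AtA), np.prod(eigenvalues))
```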