Is a matrix multiplied with its transpose something special?

In my math lectures, we talked about the Gram determinant, where a matrix is multiplied by its transpose.

Is A A^\mathrm T something special for any matrix A?

Answer

The main thing is presumably that AA^T is symmetric. Indeed, (AA^T)^T=(A^T)^TA^T=AA^T. For symmetric matrices one has the Spectral Theorem, which says that there is an orthonormal basis of eigenvectors and that every eigenvalue is real.
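
A quick numerical sanity check (a minimal sketch in NumPy; the matrix A below is an arbitrary example chosen for illustration, not one from the question):

import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])      # arbitrary 2x3 example
S = A @ A.T                           # A A^T is a 2x2 matrix

# Symmetry: S equals its own transpose
print(np.allclose(S, S.T))            # True

# Spectral theorem: real eigenvalues and an orthonormal basis of eigenvectors
eigvals, eigvecs = np.linalg.eigh(S)  # eigh is intended for symmetric matrices
print(eigvals)                        # real (here even nonnegative) eigenvalues
print(np.allclose(eigvecs.T @ eigvecs, np.eye(2)))  # columns are orthonormal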

Moreover, if A is invertible, then AA^T is also positive definite: for x \neq 0 invertibility gives A^Tx \neq 0, and hence x^TAA^Tx=(A^Tx)^T(A^Tx)=\left\|A^Tx\right\|^2> 0.

Then we have: a matrix is positive definite if and only if it is the Gram matrix of a linearly independent set of vectors.
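
For instance (again only a sketch, with arbitrarily chosen vectors): the rows of A below are linearly independent, so AA^T is their Gram matrix and its eigenvalues are strictly positive; with dependent rows the Gram matrix is only positive semidefinite.

import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])        # linearly independent rows
G = A @ A.T                            # Gram matrix of the rows of A
print(np.linalg.eigvalsh(G))           # all eigenvalues > 0: positive definite

B = np.array([[1.0, 0.0, 2.0],
              [2.0, 0.0, 4.0]])        # second row = 2 * first row (dependent)
print(np.linalg.eigvalsh(B @ B.T))     # one eigenvalue is (numerically) zero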

Last but not least, if one is interested in how much the linear map represented by A changes the norm of a vector, one can compute

\sqrt{\left<Ax,Ax\right>}=\sqrt{\left<A^TAx,x\right>}

which, for an eigenvector x of A^TA with eigenvalue \lambda, simplifies to

\sqrt{\left<Ax,Ax\right>}=\sqrt{\lambda}\sqrt{\left<x,x\right>}.

The Gram determinant, i.e. the determinant of A^TA, is just the product of these eigenvalues.
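
Both claims can be checked numerically (a small sketch; the matrix A is again an arbitrary example):

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # arbitrary invertible 2x2 example
M = A.T @ A                           # A^T A, symmetric positive definite here

eigvals, eigvecs = np.linalg.eigh(M)
lam, x = eigvals[0], eigvecs[:, 0]    # an eigenpair of A^T A

# sqrt(<Ax, Ax>) equals sqrt(lambda) * sqrt(<x, x>)
print(np.isclose(np.linalg.norm(A @ x), np.sqrt(lam) * np.linalg.norm(x)))  # True

# det(A^T A) is the product of these eigenvalues (equal to det(A)^2)
print(np.isclose(np.linalg.det(M), np.prod(eigvals)))                       # True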

Attribution
Source: Link, Question Author: Martin Ueding, Answer Author: J. W. Tanner