Are all eigenvectors, of any matrix, always orthogonal?

I have a very simple question that can be stated without proof: are all eigenvectors, of any matrix, always orthogonal? I am trying to understand principal components, and it is crucial for me to understand the basis of eigenvectors.

Answer

In general, no: the eigenvectors of an arbitrary matrix are NOT always orthogonal. But for a special class of matrices, symmetric matrices, the eigenvalues are always real, and eigenvectors corresponding to distinct eigenvalues are always orthogonal. (For a repeated eigenvalue, an orthogonal basis of its eigenspace can always be chosen, so a symmetric matrix always admits a full orthogonal set of eigenvectors.)
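A minimal sketch of this contrast with NumPy, using two small example matrices chosen for illustration (the specific entries are assumptions, not from the question):

```python
import numpy as np

# A non-symmetric matrix: its eigenvectors need not be orthogonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
vals_A, vecs_A = np.linalg.eig(A)
dot_A = vecs_A[:, 0] @ vecs_A[:, 1]   # nonzero: the eigenvectors are not orthogonal

# A symmetric matrix: its eigenvectors are orthogonal.
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals_S, vecs_S = np.linalg.eigh(S)    # eigh is the routine for symmetric matrices
dot_S = vecs_S[:, 0] @ vecs_S[:, 1]   # essentially zero

print(dot_A, dot_S)
```

For the matrix `A` above, the eigenvectors for eigenvalues 2 and 3 are (1, 0) and (1, 1)/√2, whose dot product is clearly nonzero, while the symmetric `S` gives orthogonal eigenvectors.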

For any matrix M with n rows and m columns, multiplying M by its transpose, as either MM' (n×n) or M'M (m×m), yields a symmetric matrix, so the eigenvectors of that product are always orthogonal.
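A quick check of this claim, with an arbitrary random rectangular matrix standing in for M (the dimensions here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))   # n=5 rows, m=3 columns

G = M.T @ M                       # m x m product, symmetric by construction
sym_ok = np.allclose(G, G.T)

vals, vecs = np.linalg.eigh(G)
# The eigenvector matrix V is orthogonal: V'V = I
ortho_ok = np.allclose(vecs.T @ vecs, np.eye(3))

print(sym_ok, ortho_ok)
```

The same check passes for MM', which is 5×5 here; only the size of the identity changes.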

In the application of PCA, a dataset of n samples with m features is usually represented as an n×m matrix D. The variances and covariances among those m features can be represented by the m×m matrix D'D (computed after each feature has been mean-centered), which is symmetric: the entries on the diagonal are the variances of the individual features, and the entry in row i, column j is the covariance between features i and j. PCA is applied to this symmetric matrix, so the eigenvectors (the principal directions) are guaranteed to be orthogonal.
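The PCA recipe above can be sketched end to end; the random data matrix and its shape are assumptions made only to have something concrete to run:

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.standard_normal((100, 4))    # n=100 samples, m=4 features

Dc = D - D.mean(axis=0)              # mean-center each feature
C = (Dc.T @ Dc) / (len(Dc) - 1)      # m x m covariance matrix, symmetric

vals, vecs = np.linalg.eigh(C)       # eigen-decomposition of the symmetric matrix
order = np.argsort(vals)[::-1]       # sort by decreasing explained variance
vals, vecs = vals[order], vecs[:, order]

# The principal directions are mutually orthogonal
directions_orthogonal = np.allclose(vecs.T @ vecs, np.eye(4))

# Project the centered data onto the principal components
scores = Dc @ vecs
print(directions_orthogonal)
```

Reordering the eigenvector columns by eigenvalue does not disturb their orthogonality, which is why the check still passes after sorting.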

Attribution
Source: Link, Question Author: Bober02, Answer Author: Yue Tyler Jin
