I am looking for a proof of this lemma: Assume $\Phi$ is a real, symmetric, non-singular $T\times T$ matrix with non-negative elements. Define $B=(u_1,\dots,u_T)$ as the matrix whose columns are the eigenvectors of $\Phi$, and let $\Gamma_T=\operatorname{diag}(\gamma_1,\dots,\gamma_T)$ be the diagonal matrix of the corresponding eigenvalues of $\Phi$. Then $\Phi+\Gamma_T$ is an invertible matrix.

**Answer**

This is wrong, with or without the hypothesis of non-negative entries (which was added to the question after I originally posted this answer). Take
$$\Phi=\begin{pmatrix}\sqrt2/2&1\\1&\sqrt2/2\end{pmatrix}
\quad\text{and}\quad
B=\begin{pmatrix}1&1\\1&-1\end{pmatrix};$$
then
$$\Gamma_T=\begin{pmatrix}\sqrt2/2+1&0\\0&\sqrt2/2-1\end{pmatrix}
\quad\text{and}\quad
\Phi+\Gamma_T=\begin{pmatrix}\sqrt2+1&1\\1&\sqrt2-1\end{pmatrix}$$
is non-invertible, since its determinant is $(\sqrt2+1)(\sqrt2-1)-1=2-1-1=0$.
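As a quick sanity check, the counterexample can be verified numerically. The sketch below (plain Python, no libraries; variable names are my own) confirms that the columns of $B$ are eigenvectors of $\Phi$ with eigenvalues $\sqrt2/2\pm1$, and that the determinant of $\Phi+\Gamma_T$ vanishes up to floating-point rounding.

```python
import math

a = math.sqrt(2) / 2  # diagonal entry of Phi

# Phi = [[a, 1], [1, a]]: eigenvectors (1, 1) and (1, -1),
# with eigenvalues gamma1 = a + 1 and gamma2 = a - 1.
gamma1, gamma2 = a + 1, a - 1

# Check Phi @ (1, 1) == gamma1 * (1, 1) and Phi @ (1, -1) == gamma2 * (1, -1).
assert math.isclose(a * 1 + 1 * 1, gamma1 * 1)
assert math.isclose(a * 1 + 1 * (-1), gamma2 * 1)

# Phi + Gamma_T = [[a + gamma1, 1], [1, a + gamma2]]
#              = [[sqrt(2) + 1, 1], [1, sqrt(2) - 1]]
det = (a + gamma1) * (a + gamma2) - 1 * 1
print(det)  # essentially 0, so Phi + Gamma_T is singular
```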

**Attribution**
*Source: Link, Question Author: karo solat, Answer Author: Marc van Leeuwen*