That’s my question. I have been looking around online, and people post a formula but don’t explain it. Could anyone please give me a hand with that? Cheers.

**Answer**

If the OLS regression contains a constant term, i.e. if the regressor matrix contains a column of ones, then the sum of the residuals is *exactly* equal to zero, as a matter of algebra.

**For the simple regression,**

specify the regression model

$$y_i = a + b x_i + u_i, \qquad i = 1, \dots, n$$

Then the OLS estimator $(\hat a, \hat b)$ minimizes the sum of squared residuals:

$$(\hat a, \hat b) = \operatorname*{arg\,min}_{a,\,b} \sum_{i=1}^n (y_i - a - b x_i)^2$$

For the OLS estimator to be the *argmin* of the objective function, a necessary condition is that the first partial derivatives with respect to $a$ and $b$, evaluated at $(\hat a, \hat b)$, equal zero. For our result, we need only consider the partial with respect to $a$:

$$\frac{\partial}{\partial a}\sum_{i=1}^n (y_i - a - b x_i)^2 \,\Bigg|_{(\hat a, \hat b)} = 0 \;\Rightarrow\; -2\sum_{i=1}^n (y_i - \hat a - \hat b x_i) = 0$$

But $y_i - \hat a - \hat b x_i = \hat u_i$, i.e. it is equal to the residual, so we have that

$$\sum_{i=1}^n (y_i - \hat a - \hat b x_i) = \sum_{i=1}^n \hat u_i = 0$$

The above also implies that if the regression specification does *not* include a constant term, then the sum of residuals will not, in general, be zero.
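This is easy to check numerically. The following sketch (using NumPy, with made-up data) fits the same data with and without a constant and compares the residual sums:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.5 + 2.0 * x + rng.normal(size=n)

# With a constant: regress y on [1, x]
X_const = np.column_stack([np.ones(n), x])
beta_const, *_ = np.linalg.lstsq(X_const, y, rcond=None)
resid_const = y - X_const @ beta_const

# Without a constant: regress y on x alone
beta_nc, *_ = np.linalg.lstsq(x[:, None], y, rcond=None)
resid_nc = y - x[:, None] @ beta_nc

print(resid_const.sum())  # zero up to floating-point error
print(resid_nc.sum())     # generally nonzero
```

With the constant included, the residual sum is zero up to floating-point rounding regardless of the data; without it, the sum is whatever the data happen to produce.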

**For the multiple regression,**

let $\mathbf X$ be the $n \times k$ matrix containing the regressors, $\hat{\mathbf u}$ the residual vector, and $\mathbf y$ the dependent-variable vector. Let $\mathbf M = \mathbf I_n - \mathbf X(\mathbf X'\mathbf X)^{-1}\mathbf X'$ be the “residual-maker” matrix, called thus because we have

$$\hat{\mathbf u} = \mathbf M \mathbf y$$

It is easily verified that $\mathbf M \mathbf X = \mathbf 0$. Also, $\mathbf M$ is idempotent and symmetric.
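These three properties of the residual-maker matrix can be verified numerically (a sketch with an arbitrary regressor matrix that includes a column of ones):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])

# Residual-maker matrix M = I_n - X (X'X)^{-1} X'
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)

print(np.allclose(M @ X, 0))   # MX = 0
print(np.allclose(M @ M, M))   # M is idempotent
print(np.allclose(M, M.T))     # M is symmetric
```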

Now, let $\mathbf i$ be a column vector of ones. Then the sum of residuals is

$$\sum_{j=1}^n \hat u_j = \mathbf i'\hat{\mathbf u} = \mathbf i'\mathbf M \mathbf y = \mathbf i'\mathbf M'\mathbf y = (\mathbf M \mathbf i)'\mathbf y = \mathbf 0'\mathbf y = 0$$

So we need the regressor matrix to contain a column of ones, so that we get $\mathbf M \mathbf i = \mathbf 0$.
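The matrix argument can also be checked directly (a sketch; the coefficients and data are made up). Because the ones vector lies in the column space of $\mathbf X$, $\mathbf M$ annihilates it, and the sum of residuals $\mathbf i'\mathbf M\mathbf y$ vanishes:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
# Regressor matrix that includes a column of ones
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, -0.5, 2.0]) + rng.normal(size=n)

# Residual-maker matrix M = I_n - X (X'X)^{-1} X'
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
ones = np.ones(n)

print(np.allclose(M @ ones, 0))  # Mi = 0, since i is a column of X
print(ones @ M @ y)              # i'My: the residual sum, ~0
```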

**Attribution**
*Source: Link, Question Author: Maximilian1988, Answer Author: Alecos Papadopoulos*