Can someone explain why taking an average of an average usually results in a wrong answer? Is there ever a case where the average of the average can be used correctly?

As an example, say an assessment is given to three schools, and I want to find both the average score for all three schools combined and the average score per school. When I add the three per-school averages and divide by three, I get a number that is close (within ±1 percent) to the actual overall average, but not equal to it.

**Answer**

If there are $n_1$, $n_2$, and $n_3$ students in the three schools, and the average test scores for the schools are $a_1, a_2, a_3$, respectively, the correct average is a "weighted average":

$$\frac{n_1}{n_1+n_2+n_3}a_1+\frac{n_2}{n_1+n_2+n_3}a_2+\frac{n_3}{n_1+n_2+n_3}a_3$$

The average of the averages is:

$$\frac{1}{3}a_1+\frac{1}{3}a_2+\frac{1}{3}a_3$$

These two values are exactly the same if each school has exactly the same number of students, and they tend to be close if the schools are similar in size and/or the three schools' average scores are close together.

If a school system created a small school consisting of all its smartest students, it could bump up the second value, the "average of averages", but it couldn't inflate the correct weighted average that way.
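A quick numerical sketch of the difference, using made-up enrollments and scores (the school sizes and averages below are hypothetical, chosen so one small high-scoring school skews the naive calculation):

```python
# Hypothetical enrollments and per-school average scores.
sizes = [1200, 300, 50]        # n1, n2, n3
averages = [70.0, 80.0, 95.0]  # a1, a2, a3

# Correct weighted average: each school's mean weighted by its enrollment.
weighted = sum(n * a for n, a in zip(sizes, averages)) / sum(sizes)

# Naive "average of averages": treats every school equally,
# so the tiny 50-student school counts as much as the 1200-student one.
naive = sum(averages) / len(averages)

print(round(weighted, 2))  # 72.74
print(round(naive, 2))     # 81.67
```

The small elite school pulls the naive figure up by almost nine points, while the weighted average stays anchored to the actual student population.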

**Attribution**
*Source: Link, Question Author: O.O, Answer Author: Thomas Andrews*