Many people think a standard deviation indicates the “standard” amount that individual numbers deviate from the group’s mean. Specifically, they think an SD is computed as the average (i.e., arithmetic mean) of the deviation scores, disregarding whether the original scores are above or below the mean. Not so. For most groups of numbers, the SD is about 1.25 times as large as this “average deviation from the mean” (more formally, the mean absolute deviation).
Consider, for example, this population of 10 scores: 1, 2, 3, 4, 5, 5, 6, 7, 8, and 9. Disregarding sign, the average deviation from the mean = 2.00. However, the SD = approximately 2.45. The SD is larger because it gives greater weight to scores that lie farther away from the mean. It does this by squaring the deviations. The SD is computed as the “root-mean-squared-deviation,” with these four words explaining, in reverse order, what you must do to calculate the SD: (1) figure out how far each original score deviates from the mean, (2) square each of these deviation scores, (3) take the mean of the squared deviations, (4) compute the square root of the result arrived at in Step 3.
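The four steps above can be sketched in a few lines of Python, using the example population to confirm both numbers; this is a minimal illustration, with `statistics.pstdev` serving only as a cross-check of the hand-computed result:

```python
import math
import statistics

scores = [1, 2, 3, 4, 5, 5, 6, 7, 8, 9]
mean = statistics.mean(scores)  # 5.0

# "Average deviation": the mean of the absolute deviations from the mean.
avg_dev = statistics.mean(abs(x - mean) for x in scores)  # 2.0

# SD, following the four steps: deviate, square, average, take the root.
deviations = [x - mean for x in scores]  # Step 1: deviation scores
squared = [d ** 2 for d in deviations]   # Step 2: square each one
mean_sq = statistics.mean(squared)       # Step 3: mean of the squares (6.0)
sd = math.sqrt(mean_sq)                  # Step 4: square root (≈ 2.449)

# The hand computation matches the library's population SD.
assert abs(sd - statistics.pstdev(scores)) < 1e-12
print(avg_dev, round(sd, 2))  # → 2.0 2.45
```

For this particular population the ratio SD / average deviation is about 1.22, close to the roughly 1.25 that holds for most groups of numbers.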
For more information about the standard deviation, go to http://en.wikipedia.org/wiki/Standard_deviation.