What is the Difference Between Deviation and Standard Deviation?


The main difference between deviation and standard deviation lies in their definitions and applications.

Deviation refers to the difference between a single data point and a fixed value, such as the mean. It is a measure of how far a single number is from the mean. For example, if a data set has values of 70, 62, 65, 72, 80, 70, 63, 72, 77, and 79, the mean is 71, and the deviations from the mean are -1, -9, -6, 1, 9, -1, -8, 1, 6, and 8. Note that individual deviations can be negative, and the deviations from the mean always sum to zero.
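As a quick sketch, the deviations for this dataset can be computed in a few lines of Python:

```python
# Deviation of each data point from the mean, using the dataset above.
data = [70, 62, 65, 72, 80, 70, 63, 72, 77, 79]
mean = sum(data) / len(data)           # 71.0
deviations = [x - mean for x in data]

print(deviations)  # [-1.0, -9.0, -6.0, 1.0, 9.0, -1.0, -8.0, 1.0, 6.0, 8.0]
print(sum(deviations))  # 0.0 -- deviations from the mean always cancel out
```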

Standard Deviation, on the other hand, is a measure of the dispersion of a dataset around its mean. It is calculated as the square root of the variance, which is the average of the squared deviations from the mean. The standard deviation tells us how far, on average, the data points lie from the mean. It is a more comprehensive measure of dispersion than a single deviation, because it takes into account the distances of all data points from the mean.
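Continuing the same example, here is a minimal sketch of that calculation in Python, using the population variance (dividing by n, matching the "average of the squared deviations" definition above):

```python
import math

data = [70, 62, 65, 72, 80, 70, 63, 72, 77, 79]
mean = sum(data) / len(data)                                # 71.0

# Variance: the average of the squared deviations from the mean.
variance = sum((x - mean) ** 2 for x in data) / len(data)   # 36.6

# Standard deviation: the square root of the variance.
std_dev = math.sqrt(variance)

print(round(std_dev, 2))  # 6.05
```

For a sample rather than a full population, the variance is usually computed by dividing by n - 1 instead of n; Python's standard library offers both via `statistics.pstdev` and `statistics.stdev`.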

In summary, deviation is a measure of how far a single data point is from the mean, while standard deviation is a measure of the overall dispersion of the data points around the mean.

Comparative Table: Deviation vs Standard Deviation

Here is a table highlighting the differences between the two:

| Feature | Deviation | Standard Deviation |
|---------|-----------|--------------------|
| Definition | The difference between a given value and the mean of a dataset; it represents the distance of a single data point from the mean. | A statistic that measures the dispersion of a dataset relative to its mean; it indicates the average distance of the data points from the mean. |
| Calculation | Subtract the mean from the given data point's value. | First find the variance (the average squared deviation from the mean), then take its square root: standard deviation = $$\sqrt{\text{variance}}$$. |
| Unit | Expressed in the same unit as the dataset; can be negative, zero, or positive. | Expressed in the same unit as the dataset; always zero or positive. |
| Use | Shows how far an individual data point lies from the mean. If most data points have a low deviation, the dataset is concentrated around the mean, and vice versa. | Shows the overall dispersion of the dataset. A higher standard deviation indicates that the data points are more spread out; a lower one indicates that they are closer to the mean. |

In summary, deviation represents the distance of a data point from the mean, while standard deviation measures the overall dispersion of the dataset relative to its mean.