Explanation of Least count error

Least count error results from the limited resolution of the instrument. We can understand this in the context of the least count of a measuring device. The least count of a device is equal to the smallest division on its scale. Consider the meter scale that we commonly use. What is its least count? Its smallest division is a millimeter (mm). Hence, its least count is 1 mm i.e. 10^{-3} m i.e. 0.001 m. Clearly, this meter scale can be used to measure lengths from 10^{-3} m to 1 m. It is worth knowing that the least count of a vernier scale is 10^{-4} m, and that of a screw gauge or spherometer is 10^{-5} m.
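As a quick summary of the instruments mentioned above, the least counts can be tabulated in a short sketch (the dictionary name is ours, chosen for illustration):

```python
# Least counts of common length-measuring instruments, in meters,
# as stated in the text above.
least_counts_m = {
    "meter scale": 1e-3,    # smallest division: 1 mm
    "vernier scale": 1e-4,
    "screw gauge": 1e-5,
    "spherometer": 1e-5,
}

for instrument, lc in least_counts_m.items():
    print(f"{instrument}: least count = {lc} m")
```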

Returning to the meter scale, we face a dilemma: should we measure up to the precision of the finest marking, or limit ourselves to a step before it? For example, let us read the measurement of a given rod. One end of the rod exactly matches the zero of the scale. The other end lies at the smallest marking, at 0.477 m (= 47.7 cm = 477 mm). We may argue that the measurement should be limited to markings that can definitely be relied upon. If so, then we would report the length as 0.47 m, because we may not be certain about the millimeter reading.

This is, however, unacceptable, as we are sure that the length includes some additional part beyond 0.47 m; the only thing we may err in is the reading itself, which might be 0.476 m or 0.478 m instead of 0.477 m. There is a definite chance of error due to the limitation in reading such small divisions. We would, nevertheless, be more precise and accurate by reporting the measurement as 0.477 m ± some agreed level of anticipated error. Generally, the accepted level of error in reading the smallest division is taken to be half the least count. Hence, the reading would be:

⇒ x = 0.477 ± 0.001/2 m

⇒ x = 0.477 ± 0.0005 m
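The half-least-count rule can be sketched in code; `report_reading` is a hypothetical helper name, not a standard function:

```python
def report_reading(reading, least_count):
    """Pair a reading with its anticipated error,
    taken as half the least count of the instrument."""
    return reading, least_count / 2

# Meter scale: least count 0.001 m, rod reading 0.477 m.
x, dx = report_reading(0.477, 0.001)
print(f"x = {x} ± {dx} m")  # x = 0.477 ± 0.0005 m
```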

If we report the measurement in centimeter,

⇒ x = 47.7 ± 0.05 cm


If we report the measurement in millimeter,

⇒ x = 477 ± 0.5 mm