by ryan

In preparation for the upcoming nano days event, Paul D. came by last week to give us some insight into how very, very, very small nanoparticles are and what they are doing in our pants. And although that is surely amazing to think about, for me the most interesting part of the training was a diversion into how our standards of measurement came to be. Have you ever wondered why a meter isn't a little bit longer, or a foot a little bit shorter? Plus, it raises the question: how could you tell someone else about a unit of measurement precisely without using another (i.e. a yard is three feet)?

Paul told us that the original measurement systems came about from human dimensions (palm, foot, digit, distance from fingertips to elbow). And while this was convenient, it wasn't always consistent. Imagine if ancient Yao Ming and ancient Verne Troyer had to do some business with each other.

Later on, other common objects were used. According to Paul D. (and Wikipedia), in old English times an inch was the equivalent of three barleycorns.

It seems that in the eighteenth century, with the Enlightenment in full effect, scientists began to work on better definitions of units of measurement. The yard was based on the length of a pendulum that swung one way in one second. But since gravity varies slightly across the earth, that could not be standardized completely. In 1791 the French proposed the meter as a standard equal to 1/10,000,000 of the distance from the equator to the north pole through Paris. A seven-year expedition, from 1792 to 1799, was mounted to get the number right. And pretty amazingly, although I don't know how they did it, the measurement was only a fifth of a millimeter off. Pretty soon after, they made a standard bar that could be copied and distributed.
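Out of curiosity, the pendulum idea is easy to check with the standard formula T = 2π√(L/g). Here's a minimal sketch in Python, assuming standard gravity (which is exactly the thing that varies from place to place, hence the problem):

```python
import math

# A "seconds pendulum" swings one way in 1 s, so its full period is 2 s.
g = 9.80665  # standard gravity in m/s^2; the real value varies by location
T = 2.0      # full back-and-forth period in seconds

# Rearranging T = 2*pi*sqrt(L/g) gives the pendulum length L.
L = g * (T / (2 * math.pi)) ** 2
print(round(L, 3))  # about 0.994 m, i.e. roughly 39.1 inches
```

Move the pendulum somewhere gravity is a touch weaker and L comes out a touch shorter, which is why this could never be a universal standard.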

After our good friend Albert Einstein showed that measurements of space and time are relative, the meter eventually came to be defined in terms of the speed of light. So as of 1983, the meter is defined as the length of the path travelled by light in a vacuum during a time interval of 1/299,792,458 of a second. While that's surely not as easily figured out as the distance between the fingertips and the elbow, it is much more precise.
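The 1983 definition sounds circular, but the arithmetic at least checks out, and it gives a fun sense of scale for nano days. A quick sketch:

```python
c = 299_792_458       # speed of light in a vacuum, m/s (an exact value by definition)
t = 1 / 299_792_458   # the time interval from the 1983 definition, in seconds

meter = c * t         # distance light covers in that interval
print(round(meter, 9))  # 1.0 meter, by construction

# For scale: in one nanosecond, light only gets about 30 cm.
print(round(c * 1e-9, 3))  # 0.3 m, roughly a foot per nanosecond
```

Fixing the speed of light as an exact number is the clever part: any lab with a good clock can reproduce the meter, no platinum bar required.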