If the speed of light is a universal constant, why is it that the most accurate clocks use radioactive decay as a reference, and not light passing through a vacuum?
Answered by: Steve Baker, Blogger at LetsRunWithIt.com (2013-present)
First, a small correction to the premise: atomic clocks don't use radioactive decay at all — they count oscillations of radiation emitted by atoms. The specific reason we don't use light in a vacuum is that the international standard for a "second" is: "the duration of 9,192,631,770 cycles of the radiation produced by the transition between two hyperfine levels of the ground state of the cesium-133 atom."
So a device that correctly counts that number of cycles of that transition is, BY DEFINITION, a perfect, 100% accurate clock.
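To make the "counting" point concrete, here's a toy Python sketch (my own illustration, not how a real cesium clock works internally): converting a cycle count to elapsed time involves no measured quantity whatsoever, only the defined constant.

```python
# Toy sketch: because the second is DEFINED as a cycle count,
# turning counted cycles into elapsed time needs no measurement.
CESIUM_CYCLES_PER_SECOND = 9_192_631_770  # exact, by definition

def cycles_to_seconds(cycle_count: int) -> float:
    """Elapsed time implied by a count of cesium-133 transition cycles."""
    return cycle_count / CESIUM_CYCLES_PER_SECOND

print(cycles_to_seconds(9_192_631_770))  # 1.0 -- exactly one second
print(cycles_to_seconds(4_596_315_885))  # 0.5 -- half a second
```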
If instead you were to build a box of some known length with (say) a laser at one end and a mirror at the far end, and measure the time it takes the beam to travel the length of the box and return, then you could make a clock by dividing the distance traveled by the speed of light.
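For a sense of scale, here is the arithmetic such a clock would rest on (the 1-meter box length is an assumed value, purely for illustration):

```python
# A hypothetical light-box clock: one "tick" is a round trip,
# so tick = 2 * L / c.
SPEED_OF_LIGHT = 299_792_458.0  # m/s, exact by definition

def tick_duration(box_length_m: float) -> float:
    """Round-trip travel time of light along the box and back."""
    return 2.0 * box_length_m / SPEED_OF_LIGHT

print(tick_duration(1.0))  # ~6.671e-09 s, about 6.7 nanoseconds
```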
But to do that, you'd need to know the length of the box to 100% precision, and the speed of light to 100% precision. That's going to be impossible, because any real box will change dimensions depending on temperature and external air pressure, maybe also sag due to gravity, bend due to tidal forces from the moon, change length fractionally due to magnetic fields, etc., etc. You'd also need to know that the mirror sat at some precise angle to the light beam.
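A back-of-envelope sketch of just ONE of those error sources shows how bad this gets (the steel expansion coefficient is a rough textbook figure, used purely for illustration):

```python
# Since a tick is 2L/c, any fractional error in L becomes the same
# fractional error in the clock rate. Assume a steel box with a
# thermal expansion coefficient of ~12e-6 per kelvin (approximate).
ALPHA_STEEL_PER_K = 12e-6     # fractional length change per kelvin
SECONDS_PER_DAY = 86_400

def drift_per_day(temp_change_k: float) -> float:
    """Accumulated timing error over one day from thermal expansion alone."""
    return ALPHA_STEEL_PER_K * temp_change_k * SECONDS_PER_DAY

print(drift_per_day(1.0))  # ~1.04 seconds of error per day from a 1 K change
```

A mere one-degree temperature drift would put your light-box clock off by about a second per day — and that's before gravity, air pressure, and all the rest pile on.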
There would be all sorts of possible sources of error in your clock… whereas the "atomic clock" is correct by definition.
It is tempting to change the definition of "one second" to be the time it takes for a photon to travel 299,792,458 meters in a vacuum… which would make it possible for your clock to be accurate "by definition". But this would introduce another problem: it would tie the definition of a "second" to the definition of a "meter".
But a "meter" is currently defined as the distance traveled by light in a vacuum in 1/299,792,458 of a second.
So what you'd have would be a "circular definition": the length of a meter would depend on the duration of a second, which would in turn depend on the length of a meter!
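Written out as a worked substitution (my notation, just to make the circularity explicit), the two definitions collapse into a tautology that pins down nothing:

```latex
% Current definition of the meter, in terms of the second:
1\ \mathrm{m} = \frac{c \cdot (1\ \mathrm{s})}{299{,}792{,}458}
% Proposed definition of the second, in terms of the meter:
1\ \mathrm{s} = \frac{299{,}792{,}458 \cdot (1\ \mathrm{m})}{c}
% Substituting the first into the second:
1\ \mathrm{s} = \frac{299{,}792{,}458}{c} \cdot \frac{c \cdot (1\ \mathrm{s})}{299{,}792{,}458} = 1\ \mathrm{s}
```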
Hence we define a second in terms of something that can just be counted… it depends on no other definitions. The definition of a meter depends only on the definition of a second. Life is good!
The standardization folks have since done something similar for the "kilogram": as of 2019 it is defined by fixing the numerical value of the Planck constant (a rival proposal was to define it as the mass of a specific number of silicon atoms). Either way, the definition reduces to an invariant of nature rather than a physical artifact: something that can, in principle, just be counted.
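For a sense of the numbers behind the atom-counting proposal (my own sketch; the silicon-28 molar mass is an approximate figure):

```python
# Sketch of the "count the atoms" idea from the silicon-sphere
# (Avogadro project) proposal. The Avogadro constant has been exact
# since the 2019 SI revision; the Si-28 molar mass is approximate.
AVOGADRO = 6.02214076e23              # atoms per mole, exact by definition
SI28_MOLAR_MASS_G_PER_MOL = 27.9769   # approximate

atoms_in_one_kg = (1000.0 / SI28_MOLAR_MASS_G_PER_MOL) * AVOGADRO
print(f"{atoms_in_one_kg:.4e} atoms")  # ~2.1525e+25 atoms of Si-28
```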
So what we end up with are standards that can (at least in theory) be 100% perfectly accurate.