It is claimed that two caesium clocks, if allowed to run for \(100\) years free from any disturbance, may differ by only about \(0.02~\text s\). What does this imply for the accuracy of the standard caesium clock in measuring a time interval of \(1~\text s\)?
1. \(6\times10^{-12}~\text s\)
2. \(6\times10^{-10}~\text s\)
3. \(6\times10^{-8}~\text s\)
4. \(6\times10^{-6}~\text s\)

\(100~\text{years} = 100\times365\times24\times60\times60~\text s \approx 3.15\times10^{9}~\text s\).
Given, the difference between the two clocks after \(100\) years \(= 0.02~\text s\).
Time difference accumulated in \(1~\text s = \dfrac{0.02}{3.15\times10^{9}} \approx 6.35\times10^{-12}~\text s\).
So, the accuracy in measuring a time interval of \(1~\text s = \dfrac{1}{6.35\times10^{-12}} \approx 1.57\times10^{11} \approx 10^{11}\) (approximately).
Therefore, the clock is accurate to about \(1\) part in \(10^{11}\) to \(10^{12}\); equivalently, the error in measuring \(1~\text s\) is roughly \(6\times10^{-12}~\text s\), which corresponds to option 1.
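As a quick cross-check, here is a minimal Python sketch of the same arithmetic (it assumes a 365-day year, as the solution above does; the variable names are illustrative only):

```python
# Cross-check of the caesium-clock accuracy estimate (365-day year assumed).

seconds_per_year = 365 * 24 * 60 * 60      # 31,536,000 s
total_seconds = 100 * seconds_per_year     # ~3.15e9 s elapse in 100 years
drift = 0.02                               # claimed clock difference after 100 years, in s

# Error accumulated per second of elapsed time
error_per_second = drift / total_seconds   # ~6.3e-12 s, i.e. option 1

# Fractional accuracy: about 1 part in 1.6e11
parts = 1 / error_per_second

print(f"Error in measuring 1 s: {error_per_second:.2e} s")
print(f"Accuracy: 1 part in {parts:.2e}")
```

Running it gives an error of about \(6.3\times10^{-12}~\text s\) per second, consistent with option 1 and with an accuracy of roughly \(1\) part in \(10^{11}\).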