Atomic clock

Atomic clocks are used to make extremely accurate measurements of time: current models lose less than a second every million years.

An atomic clock is a device that measures the passage of time extremely accurately by counting the electron energy transitions in atoms of certain chemical elements, such as cesium or rubidium. A good atomic clock drifts by less than a second every million years. The best atomic clocks in operation today, however, would not drift by even a second if they ran for 138 million years.
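As an illustration (not from the original text), drift figures like "1 second per million years" can be converted into the fractional frequency accuracy commonly used to compare clocks. The sketch below assumes only that conversion; the function name is ours:

```python
# Sketch: expressing "loses at most 1 second in N years" as a fractional
# frequency accuracy, a common figure of merit for clocks.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million seconds

def fractional_accuracy(years_per_second: float) -> float:
    """Fractional error of a clock that drifts 1 s over the given number of years."""
    return 1.0 / (years_per_second * SECONDS_PER_YEAR)

print(fractional_accuracy(1e6))    # 1 s per million years: ~3.2e-14
print(fractional_accuracy(138e6))  # 1 s per 138 million years: ~2.3e-16
```

The smaller the fractional accuracy, the better the clock: modern cesium fountains sit many orders of magnitude below the best quartz oscillators.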

How does the atomic clock work?

Atomic clocks are large machines capable of detecting small energy oscillations in atoms resulting from quantum energy transitions, which are called hyperfine transitions. Like all clocks, atomic clocks measure the passage of time by counting the oscillations of a resonator.

In older clocks, for example, what was counted was the number of complete swings of a pendulum. In digital wristwatches, it is the oscillations of a piezoelectric quartz crystal, which vibrates slightly when subjected to an electrical stimulus. In atomic clocks, in turn, it is common to count the number of hyperfine energy transitions between the ground state and the first excited state of cesium atoms: according to the International System of Units (SI), exactly 9,192,631,770 of these oscillations correspond to 1 second.
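The counting described above can be sketched in a few lines. This is a minimal illustration of the SI definition, not how a real clock's electronics work; the function name is ours:

```python
# Sketch: converting a count of cesium-133 hyperfine oscillations into
# elapsed seconds, using the SI definition of the second.
CESIUM_HZ = 9_192_631_770  # oscillations per second (exact, by definition)

def elapsed_seconds(oscillation_count: int) -> float:
    """Elapsed time implied by a number of counted oscillations."""
    return oscillation_count / CESIUM_HZ

# One full second corresponds to exactly 9,192,631,770 oscillations:
print(elapsed_seconds(9_192_631_770))  # 1.0
```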

The cesium atomic clock loses only one second every two million years.

In addition to atomic clocks that measure time from cesium, there are those that count hyperfine transitions for other elements, such as hydrogen and rubidium-87.


The great advantage of atomic clocks over conventional clocks is that the number of hyperfine transitions is identical for every atom of cesium (or of whatever element is used), so there are no measurement differences tied to the resonating source. Quartz crystals, by contrast, can present small structural differences from one unit to another when manufactured.


The main source of error in an atomic clock's time measurement lies in counting the transitions, which is done by detecting very low-intensity microwaves.

Who invented the atomic clock?

The first accurate atomic clock was developed in 1955 at the National Physical Laboratory, a major physics laboratory in the United Kingdom. Based on cesium-133, this clock was used to create the definition of the second currently adopted by the SI. The first atomic clock of any kind, however, had been built in 1949 at the US National Bureau of Standards, and was based on transitions in ammonia molecules rather than cesium. The first commercial atomic clock, sold to aviation and telecommunications companies, among others, was the Atomichron.

The Atomichron, the first atomic clock sold commercially to several companies.

Where is the atomic clock used?

Perhaps the most relevant application of atomic clocks is in GPS (Global Positioning System). GPS works by measuring the travel time of electromagnetic signals sent from satellites to a receiver on Earth; from the time differences between several satellites, the receiver calculates its position. Earth's gravity and the satellites' motion, however, cause small relativistic distortions in the flow of time, which are corrected with the help of the atomic clocks carried on board. Communication systems that rely on satellites, such as the internet, telephone and television, also benefit from the use of atomic clocks.
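This is why GPS needs clocks this good: the receiver turns signal travel time into distance by multiplying by the speed of light, so tiny timing errors become large position errors. A minimal sketch with an illustrative timing error (the function name is ours):

```python
# Sketch: distance error caused by a clock timing error, since GPS
# converts signal travel time into distance at the speed of light.
C = 299_792_458  # speed of light in m/s (exact)

def range_error(clock_error_seconds: float) -> float:
    """Distance error (in meters) implied by a given timing error."""
    return C * clock_error_seconds

# A 1-microsecond clock error already means roughly 300 m of position error:
print(range_error(1e-6))  # ~299.8 m
```

Nanosecond-level timing, which only atomic clocks provide, keeps the positioning error down to the meter scale.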

Currently, there are portable atomic clocks, which use the element rubidium and can be employed in aviation systems, satellites and commercial applications; their cost is relatively low, as is their useful life. The most accurate atomic clocks, however, are large and expensive, and have astonishing precision: the clock used in the United States (NIST-F2) will not drift by a single second in the next 300 million years.
