Once a year, three officials bearing three separate keys meet at the bottom of a stairwell at the International Bureau of Weights and Measures, in Sèvres, France. There they unlock a vault to check that a plum-size cylinder of platinum-iridium alloy is exactly where it should be. Then they close the vault and leave the cylinder to sit alone, under three concentric bell jars, as it has for most of the past 125 years.
This lonely cylinder is the International Prototype of the Kilogram, known colloquially as Le Grand K, and it is the last remaining physical object to define a unit of measure. It’s a quaint throwback to a time when people compared the ocean’s depth to the span of a man’s outstretched arms and the second to a tiny fraction of a year. Now we fix our rulers to the speed of light and our clocks to a spectral property of cesium. By thus linking measurement to fundamental and unchanging phenomena, scientists have paved the way for GPS satellites, gravitational-wave detectors, and many other precision technologies that simply wouldn’t have been possible before.
The trouble posed by the master kilogram is apparent in the many friction-filled steps by which it calibrates other masses. Once every few decades, a scientist plucks the cylinder from its perch with chamois-leather-padded pincers, rubs its surface with a cloth soaked in alcohol and ether, and steam-cleans it. Then he puts the prototype in a precise balance that compares it to the bureau’s official copies, which are in turn compared to copies kept by member countries. And thus the prototype’s mass trickles down to set the standard for the rest of the world.
The system has been far from seamless. When the cylinder was last removed from the vault in 1988, the bureau’s metrologists were disappointed to discover that its mass and those of its official copies had drifted apart by as much as about 50 micrograms since 1889. That discrepancy is tiny—comparable to the mass of a small grain of sugar—but it confirmed a troubling instability. All that metrologists can say is that the master kilogram seems to have lost as much as 50 µg over the course of a century relative to its siblings. But the actual drift could be up or down, and it might even be a lot more than 50 µg, because the prototype and its metallurgically identical copies could all be changing as an ensemble.
“It’s a bit ridiculous in this day and age, because it’s not just the mass that depends on the prototype. It’s all energy, all force, all units that are linked in any way to the kilogram,” retired metrologist Terry Quinn explains one gray afternoon at the bureau, about a week before an international committee was set to convene to decide the kilogram’s fate.
A former director of the bureau (often referred to by its French acronym, BIPM), Quinn has been campaigning since the early 1990s to peg the kilogram to an unchanging aspect of nature. Such a change would be a boon to scientists who depend on stable units to perform long-term measurements, Quinn says. It could also have a big impact on electrical engineering, particularly for makers of precision multimeters and other basic tools.
When he began lobbying, Quinn says he thought it would take just a few years to come up with a better standard. But the effort to develop a precise way to measure mass with respect to a constant of nature has turned out to be phenomenally difficult.
The October meeting marked a big turning point for the kilogram. Delegates from the bureau’s then 55 member countries unanimously agreed on a tentative plan to base the kilogram on a fundamental constant of quantum mechanics. Three other core units—the ampere, the mole, and the kelvin—will likely change at the same time.
This coup is largely the result of decades of steady strides in two challenging strategies for measuring mass. One approach attempts to pin down the exact electromagnetic force needed to balance the gravitational tug on an object. The other harnesses Cold War–era uranium enrichment technology and a host of experimental techniques to count the number of atoms in extremely round balls of ultrapristine silicon.
For years, the two approaches have produced starkly conflicting results. But over the past few months, metrologists have been excited to find glimmers of convergence, and the effort to pin down mass once and for all is beginning to pick up steam.
The scientists and intellectuals of the 18th century dreamed of a system of units based on something more fundamental and precise than the span of a hand or the mass of a seed. This notion solidified toward the dawn of the French Revolution, when Enlightenment thinkers shaped the beginnings of the modern metric system. They defined the meter as one ten-millionth of the distance from the North Pole to the equator. The kilogram they in turn derived from the meter, as the mass of a cubic decimeter of distilled water.
Linking units to the natural world was a lofty goal, but in the end, practicality demanded physical objects that could serve as references. The meter was a metal rod until the standard was retired in 1960, when the new measurement system—the International System of Units, or SI—pegged the meter to the wavelength of light spit out by atoms of krypton-86. The meter has since been redefined once again as the distance light moves through a vacuum in slightly more than three nanoseconds.
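That “slightly more than three nanoseconds” is easy to verify. A back-of-the-envelope check (not part of any official definition; the only input is the defined speed of light):

```python
# The SI fixes the speed of light at exactly 299,792,458 m/s, so the
# meter is the distance light covers in 1/299,792,458 of a second.
C_M_PER_S = 299_792_458  # exact by definition

time_for_one_meter_ns = 1 / C_M_PER_S * 1e9  # light-travel time, ns
print(f"{time_for_one_meter_ns:.4f} ns")     # ≈ 3.3356 ns
```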
The kilogram’s resistance to redefinition can in part be chalked up to politics. Getting an international consensus on a change to the SI takes time; one metrologist likened the process to steering a supertanker. But the bigger part of the delay comes from the sheer difficulty of measuring mass precisely with respect to unchanging standards of nature.
The challenge is most clearly seen in the effort to determine Planck’s constant, the quantum-mechanical quantity that’s set to become the new kilogram standard. Planck’s constant sets the smallest packet of energy that light of a given frequency can carry, and it appears in basic equations like E = hν, which relates the energy of a photon to the frequency of light. Because Planck’s constant is linked to energy, it can also be linked to mass through Einstein’s famous equivalence, E = mc².
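That chain of reasoning can be sketched numerically. A minimal illustration, using an assumed recent value of h (at the time of writing, the exact value to be fixed had not yet been chosen):

```python
# E = h*nu ties energy to frequency; E = m*c**2 ties energy to mass.
# Together they tie mass to Planck's constant: m = h*nu / c**2.
H = 6.62607015e-34  # Planck's constant, J*s (assumed value)
C = 299_792_458     # speed of light, m/s (exact by definition)

def mass_equivalent(nu_hz):
    """Mass equivalent of a photon of frequency nu_hz, in kilograms."""
    return H * nu_hz / C**2

# The (enormous) frequency whose photon is mass-equivalent to 1 kg:
nu_for_1kg = 1.0 * C**2 / H
print(f"{nu_for_1kg:.3e} Hz")  # ≈ 1.356e50 Hz
```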
The tool of choice for measuring Planck’s constant is the watt balance. Rather than weigh one mass against another, such a balance weighs an object against the amount of electromagnetic force needed to keep the object in place. The electrical quantities measured by the balance can be linked back to Planck’s constant using instruments that make precise quantum-electrical measurements.
A number of national labs are working on watt balances, but the largest one occupies a two-story, copper-foil-lined room in a remote building on the National Institute of Standards and Technology’s sprawling, deer-studded campus northwest of Washington, D.C. NIST’s project began in 1980, and the watt balance has been tweaked and overhauled so much that it’s a matter of debate whether the instrument should be considered the third or fourth generation of the experiment. “There’s a lot of Frankenstein here,” says Jon Pratt, who took over management of the balance last year.
A mechanical engineer and former punk rocker with a soul patch, Pratt is a newcomer to the kilogram definition effort. He’s been given the job of revamping and systematically studying the balance to see if it can be used to replicate the experiment’s 2007 result, performed by Richard Steiner and others at NIST, which still stands as the most precise watt-balance measurement made to date.
During a December 2011 visit, I got a rare glimpse of the upstairs section of the balance. Ordinarily, the entire apparatus is housed in vacuum to eliminate error due to the buoyancy force of air, which can easily fluctuate with the temperature and pressure of the room. But the balance’s vacuum chamber had been opened to allow technicians to work on a repair. The interior was a mess of wires, rods, and what looked like cantilevers. All the joints had been machined to be as loose as possible so that motions induced by gravity would always be oriented exactly downward. This became clear when, at one point, a guide accidentally nudged a component and set a few nearby parts jiggling as if they were Jell-O.
NIST’s watt balance is the largest ever built, because its designers opted to build the magnet out of a superconducting coil. But like all the world’s watt balances, it works according to the principle outlined in 1975 by Bryan Kibble, then a physicist at the National Physical Laboratory in Teddington, England, who had been working on electrical balances in order to make precision measurements of the ampere. A mass is placed on one part of the balance and a coil of conducting wire immersed in a magnetic field is used to balance the object’s downward force. The experiment operates in two modes: In weighing mode, the coil remains static, and the machine is used to measure the current that must be run through the coil to enable it to maintain its position against the weight of the object. In the other mode, the current is switched off, and the coil is made to move up and down in the magnetic field, inducing a voltage across the coil.
Ideally, these two measurements can be combined to create a simple relation between mass, current, velocity, and voltage that does not depend in any way on the physical dimensions of the coil, which are difficult to pin down. The voltage and current in the coil can be measured with great precision by using calibration instruments that exploit the Josephson effect and the quantum Hall effect, quantum-mechanical effects that both depend on Planck’s constant.
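A minimal sketch of that algebra, with purely illustrative numbers (nothing here comes from a real balance): in weighing mode m·g = (BL)·I, in moving mode U = (BL)·v, and dividing the two eliminates the geometric factor BL, leaving m = U·I/(g·v).

```python
def watt_balance_mass(voltage_v, current_a, g_m_s2, velocity_m_s):
    """Mass from the two watt-balance modes: m = U*I / (g*v).
    The coil's field-geometry factor B*L cancels out entirely."""
    return voltage_v * current_a / (g_m_s2 * velocity_m_s)

# Round-trip check with an arbitrary (never-measured) factor B*L:
BL = 570.0                   # T*m, made up for illustration
m_true, g, v = 1.0, 9.80100, 0.002
I = m_true * g / BL          # current needed to balance the weight
U = BL * v                   # voltage induced while the coil moves
print(watt_balance_mass(U, I, g, v))  # recovers 1.0 kg; BL is gone
```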
At the moment, watt balances like NIST’s use objects of known mass, calibrated against the international prototype, to measure Planck’s constant. In the future it’ll be the other way around—the value of the Planck constant will be fixed, and watt balances will use this number to measure the mass of objects placed upon them.
That is, of course, if the physicists and engineers working on the experiments can make them accurate enough. The NIST group’s 2007 measurement of the Planck constant pegged its value with an estimated precision of 36 parts per billion. In isolation, it was an impressive result that put the experimental uncertainty close to a target of 20 parts per billion. (The international prototype, however much it might be drifting, can be compared to other objects with about 10 times that precision.)
But results from another watt balance, now housed at a laboratory at the National Research Council of Canada, in Ottawa, tell a different tale. Last year, the Canadian team reported its own precise measurement of the Planck constant, which differs from the NIST group’s by some 250 parts per billion. When the two conflicting results are plotted on the same graph, their error bars don’t overlap.
Clearly, both experiments can’t be right. But in some ways this discrepancy isn’t all that surprising. Today’s best watt balances are extraordinarily sensitive: Metrologists have found they can pick up earthquakes, magnetic interference from passing trains, the gravitational tug exerted by nearby snow cover, and even subtle shifts in ground level created when wind passes through nearby trees. Many of these sources of error are random and transient and can be eliminated by simply running the experiment longer. But there are still dozens of other quantities that must be carefully measured in order to get the right result. “The error budget is large,” says Barry Wood, who manages the Canadian watt balance. “It’s over 50 items that have to be calculated and assessed, and each of those is an experiment in itself.”
Even the most careful work can overlook large sources of error. Before arriving in Ottawa in 2009, Canada’s watt balance was built and run for nearly two decades across the Atlantic at the watt balance’s birthplace—the National Physical Laboratory. Weeks before the instrument was set to be boxed up at the British lab, physicist Ian Robinson was performing last checks on the experiment when he discovered an overlooked issue: The angle of the balance beam changed when a mass was added, in part because some small metal joints, used to align the coil, had been compressing. The hard-to-detect change could offset measurements of Planck’s constant by a significant amount. “That was one of those moments where the penny dropped, and it was not very pleasant,” Robinson says. “But I was very, very glad that I thought of it.”
The Canadian team has been working on a modification to fix the problem and is now partnering with the American team to resolve the discrepancy between the two results. But progress is slow. In February, representatives from both teams met at NIST to collaborate on just one of the corrections that must be made to the measurements: the exact value of gravitational force at the exact point where the mass sits in its pan. In the case of NIST’s tall watt balance, which has a pan that sits a good 5 meters above ground level, the necessary correction can be dozens of times as big as the targeted precision of the machine. To resolve the two experiments, the watt-balance teams will likely have to run through dozens of other potential sources of error. All eyes are now on NIST, which is hoping to get a new result—the first in five years—later this year. Waiting in the wings are other watt-balance projects at the BIPM and national labs in China, France, New Zealand, and Switzerland.
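To see why that one correction looms so large, here is a back-of-the-envelope estimate using the standard free-air gravity gradient. The gradient and nominal g below are textbook approximations, not the team’s measured values:

```python
# Gravity weakens by roughly 3.086e-6 m/s^2 per meter of elevation
# (the standard free-air gradient), so a pan 5 m up sits in a
# measurably weaker field than one at ground level.
G_SURFACE = 9.80665           # nominal g, m/s^2
FREE_AIR_GRADIENT = 3.086e-6  # m/s^2 per meter of elevation

height_m = 5.0                # pan height above ground, from the text
delta_g = FREE_AIR_GRADIENT * height_m
shift_ppb = delta_g / G_SURFACE * 1e9
print(f"{shift_ppb:.0f} ppb") # ≈ 1570 ppb, dozens of times 20 ppb
```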
Collaboration is new for the two leading watt-balance teams, which for years have operated on opposite sides of the Atlantic. But it’s been a vital part of another effort to measure mass—by linking it not to Planck’s constant but to the mass of an atom.
The effort centers around the Avogadro constant, a quantity often used in chemistry to link the masses of macroscopic objects to the mass of their molecular or atomic constituents. Avogadro’s constant is currently defined as the number of atoms in a mole of a substance, and its value is identical to the number of carbon-12 atoms needed to make up exactly 12 grams. But Avogadro’s constant can just as easily be used to work from any sort of atom all the way up to the kilogram.
Silicon is a natural candidate for such measurements. Thanks to the semiconductor industry, we can grow silicon crystals that are pretty much structurally perfect, making them ideal for measurements of basic crystal properties. But beating the uncertainty in Avogadro’s constant down to a level that could enable it to replace the international prototype is a different story.
Natural silicon isn’t pure. Although its dominant isotope is silicon-28, nearly 8 percent of it consists of the heavier isotopes silicon-29 and silicon-30. That slight contamination gives rise to a big uncertainty in silicon’s molar mass. In the early 2000s, Peter Becker, who was spearheading efforts to measure Avogadro’s constant at Germany’s metrological institute, the Physikalisch-Technische Bundesanstalt, realized that natural silicon wouldn’t work. But fortune intervened. A colleague from what was once East Germany suggested the team use a Russian uranium enrichment facility to purify some material. Becker raised €2 million to get the material. After about half a year and the efforts of roughly 250 gas centrifuges, the international team that had pitched in money was presented with 6 kg of 99.995 percent silicon-28.
After a crystal was grown from that stock, it was cut into two spheres, which were polished to such extreme roundness that repeated measurements of a single parameter—the sphere’s diameter—could be used to determine its volume. The spheres have since been toted from lab to lab in a transparent acrylic suitcase, always carried by hand onto airplanes because the objects are too precious to ship. Over the course of two years, an international team worked to pin down every property of the spheres. A reflective cavity was used to establish the diameter, and thus the volume, of each sphere. Teams at the BIPM and in Germany and Japan measured its mass against official kilogram copies. In Italy, a team used X-rays to work out the exact spacing between atoms in the crystal. In Switzerland, researchers characterized the chemical composition of the thin layer of quartz-like oxide that had grown around them. A team in Germany developed a novel dilution technique to work out the exact isotopic composition, which determines the molar mass of the spheres.
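The arithmetic behind those measurements can be sketched simply. Silicon’s diamond-cubic unit cell contains eight atoms, so the atom count follows from the sphere’s volume and the lattice spacing, and the sphere’s mass then yields Avogadro’s constant. The figures below are rounded, illustrative values, not the team’s actual data:

```python
from math import pi

a = 5.431e-10    # silicon lattice parameter, m (approximate)
d = 0.09373      # sphere diameter, m (the real spheres are ~93.7 mm)
m = 1.0          # sphere mass, kg (the real spheres weigh 1 kg)
M = 0.027977     # molar mass of silicon-28, kg/mol (approximate)

volume = pi * d**3 / 6          # sphere volume from its diameter
n_atoms = 8 * volume / a**3     # 8 atoms per cubic unit cell
N_A = n_atoms * M / m           # Avogadro's constant, per mole
print(f"{N_A:.3e} per mole")    # ≈ 6.02e23
```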
Last year, the Avogadro team published their first results, which pinpointed Avogadro’s constant to a precision of 30 parts per billion. The precision was a fabulous improvement over the old results, but when it was converted to Planck’s constant, the measurement was disappointingly off—it didn’t agree with the American watt-balance measurements and had only a slight overlap with the Canadian watt-balance results. Since then, two teams of chemists in the United States and Canada have joined the effort to measure the molar mass of the silicon crystal. The measurement is difficult—the natural silicon in laboratory glassware or even in the dust in the air can throw off measurements, says Greg Turk, who is leading the effort at NIST. But early results from the Canadian team have pushed things closer to the results of the Canadian watt balance, suggesting that a convergence might just be under way.
The international committee in charge of mass has set some exacting standards for the kilogram experiments. At least two experiments with the watt balance, together with the Avogadro project, must demonstrate an accuracy equivalent to 50 parts per billion. One experiment must demonstrate an accuracy of 20 parts per billion or better. All the results must agree with one another.
So far, the NIST and Avogadro experiments meet the first requirement, but the second requirement is still out of reach, and there are unexplained differences between existing results. Some express bafflement over the stringency of the requirements. “Nobody talks about this, but the uncertainty in uncertainty is huge,” says NIST’s Pratt. “Sweating over 20 parts per billion or 36 parts per billion seems a bit pedantic.”
The requirements are particularly nonsensical, says Quinn, given the fact that the real uncertainty in the mass of the international prototype is ignored. Despite its known drift with respect to its copies, the uncertainty in the mass of the kilogram is, by definition, zero. “The point I take and I emphasize is that our knowledge of the absolute mass of the kilogram is so poor that almost anything else is better,” Quinn says. “I think that we could do [the redefinition] today and we would be better off.”
One immediate benefit could be to electrical measurements. In the SI, the ampere is still impractically defined by the force between infinitely long conductors. Since 1990, those in need of precise voltage and resistance measurements have used a separate system of units, which employs what are now somewhat outdated values for several fundamental constants. Pegging the kilogram to Planck’s constant—and the ampere to the elementary charge—will bring electrical units back into the fold.
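Those separate electrical units rest on two quantum constants: the Josephson constant K_J = 2e/h (for voltage) and the von Klitzing constant R_K = h/e² (for resistance), whose values were conventionally fixed in 1990. A sketch of how the 1990 conventions compare with values computed from assumed recent figures for h and e:

```python
E = 1.602176634e-19   # elementary charge, C (assumed value)
H = 6.62607015e-34    # Planck's constant, J*s (assumed value)

K_J = 2 * E / H       # Josephson constant, Hz/V
R_K = H / E**2        # von Klitzing constant, ohms

K_J_90 = 483_597.9e9  # conventional value fixed in 1990, Hz/V
R_K_90 = 25_812.807   # conventional value fixed in 1990, ohms

shift_ppb = (K_J / K_J_90 - 1) * 1e9
print(f"K_J differs from its 1990 value by {shift_ppb:.0f} ppb")
```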
The kilogram hunters are now eyeing 2014—the next General Conference on Weights and Measures—as the year when all their hard work will finally pay off. But it’s far from guaranteed that the experiments will agree in time to meet such a deadline. Nonetheless, metrologists are getting ready.
Sometime soon, the keepers of the kilogram will take the cylinder out of its enclosure for the first time in more than 20 years. Once again, they will measure it against its copies, in the hope of establishing as close a link as possible between the present-day mass of the cylinder and the results of the experiments that would replace it. The retiring of the prototype will be slow and careful—which is only fitting for an object that has for so long supported the weight of the world.
This article originally appeared in print as “Consider the Kilogram.”