Industrial manufacturing is subject to stringent testing protocols aimed at guaranteeing the quality of the final product. In some circumstances, though, these methods are not sensitive enough to detect flaws, leading to incorrect determinations of quality. Researchers at the Istituto Nazionale di Ricerca Metrologica (INRiM) in Torino, Italy, have now shown that alternatives based on quantum entanglement can produce more accurate assessments of “good” and “failed” products. Importantly, the INRiM team’s system uses relatively simple equipment, increasing the chances that it could be used in industrial settings.
Some degree of error is inherent even in highly refined procedures such as quality-control tests. Spectroscopic measurements, for example, require a laser source and a detector that are both subject to fluctuations. For tests that must use low levels of light, as is the case for highly photosensitive samples, the signal-to-noise ratio tends to be lower, which weighs more heavily on the accuracy of the results. In classical methods, the error probabilities depend on the Gaussian distribution of the fluctuations of the entire system – in this case, the laser plus the detector.
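As a rough illustration of how Gaussian fluctuations set the error floor of a classical pass/fail test, the toy model below (hypothetical function and parameter names, not the INRiM analysis) computes the misclassification probability from the overlap of two Gaussian signal distributions, one for a “good” sample and one for a “failed” one:

```python
import math

def classical_error_probability(t_good, t_bad, mean_photons, excess_noise=0.0):
    """Toy model of a classical conformance test: a sample is 'good' if its
    transmittance is t_good and 'failed' if it is t_bad. The detected signal
    fluctuates as a Gaussian with shot-noise variance equal to its mean,
    plus any excess laser/detector noise. A threshold halfway between the
    two mean signals decides pass/fail."""
    mu_good = t_good * mean_photons
    mu_bad = t_bad * mean_photons
    threshold = 0.5 * (mu_good + mu_bad)
    # The Gaussian tail beyond the threshold on each side gives the two error modes
    p_false_fail = 0.5 * math.erfc(
        (mu_good - threshold) / math.sqrt(2 * (mu_good + excess_noise)))
    p_false_pass = 0.5 * math.erfc(
        (threshold - mu_bad) / math.sqrt(2 * (mu_bad + excess_noise)))
    return 0.5 * (p_false_fail + p_false_pass)

# Fewer photons (low light) or more excess noise means a larger error probability
print(classical_error_probability(0.99, 0.90, 10_000))  # bright probe: tiny error
print(classical_error_probability(0.99, 0.90, 100))     # dim probe: much larger error
```

In this picture the only classical remedies are more light or less noise, which is exactly what the quantum scheme tries to sidestep.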
In principle, quantum technologies – particularly quantum metrology – offer a way of overcoming the limits of classical systems. Some such technologies are already in use within gravitational wave detectors, where they significantly boost the system’s ability to detect small signals in a noisy background.
Entanglement does the trick
In the latest work, researchers led by Marco Genovese of the INRiM employed a subset of these methods, known as quantum conformance tests (QCTs), to obtain lower error probabilities in the type of measurements used in industrial quality-control assessments. The team’s first step is to generate entangled photons via a process called spontaneous parametric down-conversion, in which pumping a barium borate crystal with a 405 nm laser produces a special type of entangled state known as a two-mode squeezed vacuum. The entangled photons are then sent down different paths, with one photon passing through the sample being tested and the other following a reference path. The two paths are kept at the same length by letting the reference photon pass through a non-absorbing glass. Finally, the photons impinge on different regions of a single CCD detector, which counts photons from each path in order to determine the photon-number correlation of the entangled beams.
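The essence of the twin-beam measurement can be sketched in a few lines of stdlib Python (all names hypothetical, and with a Poissonian pair source standing in for the true two-mode squeezed vacuum, whose per-mode statistics are thermal): each down-conversion event puts one photon in the probe arm and one in the reference arm, so dividing the two total counts estimates the sample’s transmittance:

```python
import math
import random

def sample_poisson(rng, lam):
    # Knuth's algorithm; adequate for the small mean photon numbers used here
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def twin_beam_transmittance(n_pulses, mean_pairs, transmittance, seed=7):
    """Estimate a sample's transmittance from photon-number-correlated beams.
    Each pulse produces n photon pairs; the reference arm counts all n, while
    each probe photon survives the sample with probability `transmittance`."""
    rng = random.Random(seed)
    probe_total = ref_total = 0
    for _ in range(n_pulses):
        n = sample_poisson(rng, mean_pairs)  # pairs generated in this pulse
        ref_total += n
        probe_total += sum(1 for _ in range(n) if rng.random() < transmittance)
    return probe_total / ref_total

print(twin_beam_transmittance(50_000, 3.0, 0.8))  # close to the true value 0.8
```

Because the reference counts track the probe counts photon by photon, fluctuations of the source drop out of the ratio, which is the intuition behind measuring the photon-number correlation on the CCD.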
The team analysed the error probabilities for both classical and quantum correlated methods and concluded that QCT led to better results in all cases. Notably, the researchers demonstrated that this “quantum advantage” persists for a wide range of parameters, even with experimental imperfections. Although the accuracy of the QCT method dropped when they simulated the degree of signal loss expected in a realistic industrial setting, it remained 20% higher than that of classical methods.
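A purely classical toy model (hypothetical names, stdlib only) hints at why a correlated reference beam helps: normalizing the probe signal by a simultaneously measured reference cancels common-mode source jitter and tightens the transmittance estimate. The genuine quantum advantage goes further, pushing the correlations below the shot-noise limit, which a classical sketch like this cannot reproduce:

```python
import math
import random
from statistics import pstdev

def estimate_spreads(n_trials=300, n_pulses=50, mean_photons=1000.0,
                     t=0.9, pump_jitter=0.05, seed=11):
    """Spread of transmittance estimates with and without a reference arm.
    The pump intensity jitters from pulse to pulse (common to both arms);
    the probe arm additionally carries Gaussian shot noise."""
    rng = random.Random(seed)
    single, referenced = [], []
    for _ in range(n_trials):
        probe_tot = ref_tot = 0.0
        for _ in range(n_pulses):
            n = mean_photons * (1.0 + rng.gauss(0.0, pump_jitter))
            ref_tot += n
            probe_tot += n * t + rng.gauss(0.0, math.sqrt(max(n * t, 0.0)))
        single.append(probe_tot / (n_pulses * mean_photons))  # trusts the nominal power
        referenced.append(probe_tot / ref_tot)                # uses the measured reference
    return pstdev(single), pstdev(referenced)

spread_single, spread_referenced = estimate_spreads()
print(spread_single, spread_referenced)  # the referenced estimate fluctuates less
```

Even this classical common-mode rejection leaves the shot noise of the probe beam untouched; the entangled twin beams attack that remaining term as well.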
Other use cases
According to Genovese, the experiment demonstrates how QCT can be realized with present technology, making practical applications possible in the near future. He adds that the method could also have other use cases, with one example being “quantum reading” – using entangled photons to read data from a classical memory with greater efficiency than is possible with classical light. “We will pursue the studies of new applications of quantum light to quantum metrology, both exploiting photon number entangled states (as in this case), but also studying other methods [such as] using the properties of colour centres in diamond or by applying new paradigms of quantum measurement (as weak measurements),” he tells Physics World.
Pepijn Pinkse, a researcher at the University of Twente in the Netherlands who was not involved in the work, says that the results are an interesting and novel interpretation of QCT. In his opinion, however, an improvement of 20% is not enough to make the method attractive for widespread use, especially as it requires longer measurement times. “Better results can be obtained by using 10 or even 100 times more light,” he explains.
For that reason, Pinkse thinks that the most likely applications for this technology will be in situations where the light intensity needs to be kept low, such as tests on photosensitive systems like biological tissues or pigments that can bleach. He also suggests that bigger gains in accuracy might be obtained by extracting data from correlations in other quantities, such as the polarization of the entangled photons.
The researchers describe their work in Science Advances.