
4.1 Theoretical concepts

4.1.1 Quantum entropy

If we want to define randomness, we find many different approaches in the literature. For philosophers, randomness is a property of any event that happens by chance [87], and one cannot always generate or measure it. In information theory, a sequence of bits is called random if its Kolmogorov complexity is maximal [88]; this definition, however, does not guarantee the unpredictability of the bit sequence, an essential criterion for many applications. In this chapter, we define randomness as a property of a physical process whose outcome is uniformly distributed and independent of all information available in advance [89], and we show that the QRNGs described in the following sections output random data compliant with this definition.


Figure 4.1: Model of a quantum random number generator based on photon detectors: a light source illuminates a photon detector through a lossy channel with a transmission probability η including all the optical coupling losses and the photon detection efficiency of the detector. The photon detector plays the role of the transducer and outputs digital bits correlated with the number of detected photons.

A QRNG based on photon detectors can be modelled as a light source emitting photons according to a certain statistical time distribution and illuminating a photon detector through a lossy channel with a transmission probability η that includes all the optical coupling losses and the photon detection efficiency of the detector. The photon detector plays the role of the transducer and outputs digital bits correlated with the number of detected photons. Figure 4.1 illustrates this model.
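To make the model concrete, the following toy simulation sketches it under simplifying assumptions that are ours, not the chapter's: a Poissonian source and a threshold (click/no-click) detector. The channel loss is modelled as binomial thinning with probability η.

    import numpy as np

    rng = np.random.default_rng()  # classical RNG, used here only to drive the toy simulation

    def simulate_raw_bits(mean_photons, eta, n_slots):
        """Toy model of Figure 4.1: Poissonian emission, lossy channel, threshold detector."""
        emitted = rng.poisson(mean_photons, n_slots)   # photons emitted per time slot
        detected = rng.binomial(emitted, eta)          # each photon survives with probability eta
        return (detected > 0).astype(np.uint8)         # bit 1 on a detector click, 0 otherwise

    raw_bits = simulate_raw_bits(mean_photons=1.0, eta=0.5, n_slots=16)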

To quantify the randomness of a sequence of bits, we refer to the concept of entropy in information theory, first introduced by Shannon [90]. Entropy measures the uncertainty associated with a random variable and is expressed in bits. Depending on the type of light source and on the detection mechanism of the transducer, the origin of the quantum entropy varies, and so does its amount. That is why an accurate model of the physical process is required, one that takes the system imperfections into account. In fact, in practical implementations of QRNGs, the desired quantum process cannot be created or measured perfectly: there are always hardware imperfections and various noise sources that cannot be controlled and that can affect the raw output of the TRNG by introducing a bias, a pattern or some classical noise. All this side information should be taken into account for an accurate quantification of the quantum entropy.
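For reference, the two standard information-theoretic quantities involved are the Shannon entropy and the min-entropy of a random variable X with distribution p(x):

    H(X) = -∑_x p(x) log₂ p(x),        H_min(X) = -log₂ max_x p(x)

The min-entropy never exceeds the Shannon entropy and is the conservative measure used for extraction; the quantity h in Equation (4.1) below is the min-entropy per raw bit.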

4.1.2 Entropy extraction

The hardware imperfections and noise sources mentioned above can dramatically reduce the amount of generated quantum entropy. Fortunately, it is still possible to obtain true randomness by applying an appropriate randomness extraction to the raw random bits generated by the device. Block-wise hashing functions are widely used; for our QRNGs, we used a hash function based on a vector-matrix multiplication with a Toeplitz matrix [4]. This extractor is applied to vectors of n raw bits and outputs shorter vectors of k true random bits. The elements of the n × k extraction matrix are usually constant and generated by an independent RNG.
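As an illustration, here is a minimal sketch of Toeplitz-matrix extraction over GF(2); the function names and the NumPy-based construction are ours, and a real implementation would typically use an FFT-based product for speed.

    import numpy as np

    def toeplitz_extract(raw_bits, seed_bits, k):
        """Compress n raw bits into k nearly uniform bits via a binary Toeplitz matrix.
        seed_bits (length n + k - 1) defines the matrix and comes from an independent RNG;
        it is chosen once and reused for every block."""
        raw = np.asarray(raw_bits, dtype=np.int64)
        seed = np.asarray(seed_bits, dtype=np.int64)
        n = raw.size
        assert seed.size == n + k - 1
        i, j = np.indices((k, n))
        T = seed[i - j + n - 1]      # T[i, j] depends only on i - j: a Toeplitz matrix
        return (T @ raw) % 2         # vector-matrix multiplication over GF(2)

    rng = np.random.default_rng()
    seed = rng.integers(0, 2, size=1024 + 721 - 1)   # fixed once, from an independent RNG
    # out = toeplitz_extract(raw_1024_bits, seed, k=721)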

Once n, the length of the raw string, is chosen, the parameter k is bounded by the probability ε that the output bit string deviates from a perfectly random bit string. If the extraction function is taken from a two-universal family of hash functions, this failure probability can be quantified by the Leftover Hash Lemma with side information:

ε = 2^{-(h·n - k)/2}    (4.1)

where h is the min-entropy per bit of the input vector of n raw bits. Since a value of ε = 0 is generally unachievable, we try to keep ε below 2^{-100}, implying that even using millions of photon detectors for a time longer than the age of the universe, one would not be able to observe any deviation from a perfectly random sequence of bits.
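Rearranging Equation (4.1) gives the largest safe output length: ε ≤ 2^{-s} requires k ≤ h·n - 2s. A small sketch, with an illustrative function name and illustrative numbers:

    def max_output_length(n, h, s=100):
        """Largest k with eps = 2^{-(h*n - k)/2} <= 2^{-s}, i.e. k <= h*n - 2*s."""
        return int(h * n - 2 * s)

    # Example: n = 1024 raw bits with h = 0.9 bits of min-entropy per bit and a
    # target eps <= 2^-100 allows k = int(921.6 - 200) = 721 extracted bits.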

4.1.3 Statistical tests

After efficient extraction, the final sequences of bits should be uniformly distributed and independent of all information available in advance. However, this statement alone is not sufficient for applications requiring a high level of security, such as cryptography: the generated random numbers still need to be tested for particular weaknesses.

On the other hand, if the theoretical model of the QRNG gives near-unity entropy per bit, the extraction may not be necessary; in that case, another proof is needed to "certify" the generated data for use and to increase the level of trust in the generator.

Many standard batteries of statistical tests have been designed to address this concern. These tests characterize the properties of a random sequence in terms of probabilities. Among the tested criteria are the proportion of zeroes and ones in the entire sequence or within a block of bits, the frequency of runs of identical bits, the presence of periodic and aperiodic patterns, and the compressibility. The NIST test suite (SP 800-90B) [91] is one of the most widely used sets of tests to characterize physical RNGs, but we can also cite the "Diehard" battery of tests [92] and the "Dieharder" tests [93], which combine both the NIST and Diehard tests with some extra tests.
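As a flavour of what such tests compute, here is a sketch of the simplest one, the frequency (monobit) test; the code follows the standard definition of the test but is our illustration, not the suites' own implementation.

    import math

    def monobit_p_value(bits):
        """Frequency (monobit) test: is the balance of ones and zeroes
        compatible with a uniform source? Pass at the usual level if p >= 0.01."""
        n = len(bits)
        s = sum(1 if b else -1 for b in bits)      # map 1 -> +1, 0 -> -1
        s_obs = abs(s) / math.sqrt(n)
        return math.erfc(s_obs / math.sqrt(2))     # p-value under the uniform hypothesis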

Note that there is no "complete" set of tests for a perfect evaluation of randomness, and that some sequences are expected to fail the statistical tests with a limited probability. Moreover, the results of these tests should be interpreted cautiously to avoid incorrect conclusions.
