[OpenWrt-Devel] [PATCH RFC 0/5] ath79: add micro non-physical true RNG based on timing jitter

Stephan Müller smueller at chronox.de
Sat May 25 15:35:31 EDT 2019


Am Montag, 20. Mai 2019, 18:13:20 CEST schrieb Petr Štetiar:

Hi Petr,

A system called TNT BOM BOM sent me test results for Qubes. I am not sure whom 
to reply to regarding the analysis. Therefore, I will reply to this thread.

The first test, "Test-Results", shows that the heuristic validating whether the 
underlying platform is sufficient for the Jitter RNG detected no insufficiency 
during 10,000 test runs. Check.

The file foldtime.O0 contains test results for the non-optimized binary code 
that is the basis for the Jitter RNG. To understand what it shows, we have to 
understand what the Jitter RNG really does: it simply measures the execution 
time of a fixed code fragment. The test does the same, i.e. it measures what 
the Jitter RNG would measure. Each time delta is simply recorded.
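
For illustration, the following is a minimal sketch of that measurement
principle (this is not the actual jitterentropy code; the clock source and the
measured code fragment are merely stand-ins): time a fixed code fragment with
a high-resolution clock and record each time delta.

#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Return a nanosecond timestamp from a monotonic clock. */
static uint64_t now_ns(void)
{
	struct timespec ts;

	clock_gettime(CLOCK_MONOTONIC, &ts);
	return (uint64_t)ts.tv_sec * 1000000000ULL + (uint64_t)ts.tv_nsec;
}

int main(void)
{
	volatile uint64_t sink = 0;

	for (int i = 0; i < 10000; i++) {
		uint64_t start = now_ns();

		/* fixed code fragment whose execution time is measured */
		for (int j = 0; j < 64; j++)
			sink += (sink << 3) ^ (uint64_t)j;

		/* the time delta is the raw material of the Jitter RNG */
		printf("%llu\n", (unsigned long long)(now_ns() - start));
	}
	return 0;
}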

Each time delta is expected to contribute entropy to the entropy pool. But how 
much? We can use the SP800-90B tool set provided by NIST at [1]. This tool, 
however, can only process input data with a window size of a few bits at most. 
Thus, we take the 4 LSB of each time delta, hoping that they already contain 
sufficient entropy.
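
To reproduce this preparation step, a small helper along the following lines
can be used (the file names are assumptions, and the deltas are assumed to be
stored as one decimal value per line). It keeps only the 4 LSB of each delta
and stores one symbol per byte, a format the tool from [1] can then analyze
with 4 bits per symbol; see the tool's README for the exact invocation.

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void)
{
	FILE *in = fopen("foldtime.O0", "r");            /* raw deltas, one per line (assumed) */
	FILE *out = fopen("foldtime.O0-4lsb.bin", "wb"); /* hypothetical output file name */
	unsigned long long delta;

	if (!in || !out)
		return EXIT_FAILURE;

	while (fscanf(in, "%llu", &delta) == 1) {
		uint8_t symbol = delta & 0xF;   /* keep the 4 least significant bits */

		fwrite(&symbol, 1, 1, out);
	}

	fclose(in);
	fclose(out);
	return EXIT_SUCCESS;
}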

Using the tool [1], we get the following output:

Running non-IID tests...

Running Most Common Value Estimate...
Bitstring MCV Estimate: mode = 2005620, p-hat = 0.50140499999999999, p_u = 0.50204895486400081
        Most Common Value Estimate (bit string) = 0.994100 / 1 bit(s)

Running Entropic Statistic Estimates (bit strings only)...
Bitstring Collision Estimate: X-bar = 2.5010973564651491, sigma-hat = 0.49999895212561996, p = 0.5
        Collision Test Estimate (bit string) = 1.000000 / 1 bit(s)
Bitstring Markov Estimate: P_0 = 0.50140499999999999, P_1 = 0.49859500000000001, P_0,0 = 0.50032309211116766, P_0,1 = 0.49967690788883234, P_1,0 = 0.50249325729964067, P_1,1 = 0.49750674270035933, p_max = 3.86818991019963e-39
        Markov Test Estimate (bit string) = 0.996903 / 1 bit(s)
Bitstring Compression Estimate: X-bar = 5.2170320393664023, sigma-hat = 1.0146785561878935, p = 0.025847044943319686
        Compression Test Estimate (bit string) = 0.878976 / 1 bit(s)

Running Tuple Estimates...
Bitstring t-Tuple Estimate: t = 18, p-hat_max = 0.52360109960331436, p_u = 0.52424433922577907
Bitstring LRS Estimate: u = 19, v = 42, p-hat = 0.50001215824001477, p_u = 0.50065611564620627
        T-Tuple Test Estimate (bit string) = 0.931689 / 1 bit(s)
        LRS Test Estimate (bit string) = 0.998108 / 1 bit(s)

Running Predictor Estimates...
Bitstring MultiMCW Prediction Estimate: N = 3999937, Pglobal' = 0.50046008453798463 (C = 1999233) Plocal can't affect result (r = 24)
        Multi Most Common in Window (MultiMCW) Prediction Test Estimate (bit string) = 0.998673 / 1 bit(s)
Bitstring Lag Prediction Estimate: N = 3999999, Pglobal' = 0.50117058226135014 (C = 2002106) Plocal can't affect result (r = 22)
        Lag Prediction Test Estimate (bit string) = 0.996626 / 1 bit(s)
Bitstring MultiMMC Prediction Estimate: N = 3999998, Pglobal' = 0.50240995443366221 (C = 2007063) Plocal can't affect result (r = 21)
        Multi Markov Model with Counting (MultiMMC) Prediction Test Estimate (bit string) = 0.993063 / 1 bit(s)
Bitstring LZ78Y Prediction Estimate: N = 3999983, Pglobal' = 0.50195008712868949 (C = 2005216) Plocal can't affect result (r = 24)
        LZ78Y Prediction Test Estimate (bit string) = 0.994384 / 1 bit(s)

h': 0.878976


- as we analyzed 4 bits of each time delta, we get 4 * 0.878976 = 3.515904 
bits of entropy for the 4 analyzed bits of each time delta

- assuming the worst case that all other bits in the time delta have no 
entropy, we have 3.515904 bits of entropy per time delta

- the Jitter RNG gathers 64 time deltas to return 64 bits of random data, and 
it uses an LFSR with a primitive and irreducible polynomial, which is entropy 
preserving. Thus, the Jitter RNG collected 64 * 3.515904 = 225.017856 bits of 
entropy for its 64 bit output.

- as the Jitter RNG maintains a 64 bit entropy pool, its entropy content 
cannot be larger than the pool itself. Thus, the entropy content in the pool 
after collecting 64 time deltas is min(64 bits, 225.017856 bits) = 64 bits 
(see the small recap below)
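
As a small recap of the figures above (purely illustrative, only restating the
numbers already given):

#include <stdio.h>

int main(void)
{
	double h_per_bit   = 0.878976;           /* min-entropy per analyzed bit (h') */
	double h_per_delta = 4 * h_per_bit;      /* 4 LSB analyzed per time delta */
	double h_collected = 64 * h_per_delta;   /* 64 time deltas per 64 bit output */
	double pool_bits   = 64.0;               /* size of the entropy pool in bits */
	double h_credited  = h_collected < pool_bits ? h_collected : pool_bits;

	printf("per delta: %f bits, collected: %f bits, credited: %f bits\n",
	       h_per_delta, h_collected, h_credited);
	return 0;
}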

This implies that the Jitter RNG data has (close to) one bit of entropy per 
data bit, i.e. (close to) 64 bits of entropy per 64 bit output block.

Bottom line: when the Jitter RNG injects 64 bits of data into the Linux /dev/
random via the IOCTL, it is appropriate that the entropy estimator is 
increased by 64 bits.
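
For reference, a hedged sketch of such an injection (this is not the
jitterentropy-rngd code; the function name and error handling are
illustrative) could look as follows: 64 bits of Jitter RNG output are handed
to the kernel via the RNDADDENTROPY ioctl together with a 64 bit entropy
credit. The caller needs the privilege to write to /dev/random (typically
root).

#include <fcntl.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/random.h>

/* Inject 8 bytes of random data and credit 64 bits of entropy for them. */
int inject_64bits(const uint8_t data[8])
{
	struct rand_pool_info *info;
	int fd, ret;

	info = malloc(sizeof(*info) + 8);
	if (!info)
		return -1;
	info->entropy_count = 64;       /* entropy credit in bits */
	info->buf_size = 8;             /* payload size in bytes */
	memcpy(info->buf, data, 8);

	fd = open("/dev/random", O_WRONLY);
	if (fd < 0) {
		free(info);
		return -1;
	}
	ret = ioctl(fd, RNDADDENTROPY, info);

	close(fd);
	free(info);
	return ret;
}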

Bottom line: From my perspective, I see no issue in using the Jitter RNG as a 
noise source in your environments.


Note: applying the Shannon entropy formula to the data yields much higher 
entropy values, since Shannon entropy averages over all observed symbols, 
whereas the SP800-90B estimates target the more conservative min-entropy 
derived from the most likely symbol.

Note II: this assessment complies with the entropy assessments to be performed 
for a NIST FIPS 140-2 validation compliant with FIPS 140-2 IG 7.15.

[1] https://github.com/usnistgov/SP800-90B_EntropyAssessment



Ciao
Stephan


