There is a purported NIST tool called SP800-90B_EntropyAssessment on GitHub. It's designed to be used in accordance with NIST's Recommendation for the Entropy Sources Used for Random Bit Generation. Essentially, the tools measure the entropy of datasets captured from entropy sources. They're meant to output estimates between 0 and 8 bits/byte (but be warned that the Python tests take many hours to complete).
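For context, here's my understanding of the kind of estimate these tools produce. This is my own Python sketch of the most common value estimate from SP 800-90B (section 6.3.1, if I'm reading the spec correctly), not code from the repository; note that for byte-wide samples it can never exceed 8 bits/byte by construction:

```python
import math
from collections import Counter

def mcv_min_entropy(data: bytes) -> float:
    """Most common value estimate: min-entropy per sample derived
    from a 99% upper confidence bound on the probability of the
    most frequent symbol. Capped at 8 bits/byte for byte data,
    since no byte value can have probability below 1/256."""
    n = len(data)
    p_hat = Counter(data).most_common(1)[0][1] / n
    # Upper confidence bound on p_hat (z = 2.576 per the spec)
    p_u = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_u)

# e.g. mcv_min_entropy(open("random.bin", "rb").read())
# gives close to 8.0 for good data, and ~0.0 for a constant stream.
```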
I don't know if that Git repository is officially sanctioned by the US Department of Commerce, or if these tools have been written by hobbyists during their lunch breaks.
The C++ IID tool doesn't compile at all (`bzlib.h: No such file or directory`, which suggests the bzip2 development headers are an undocumented dependency). The C++ non-IID tool gives the included 1.2 MB $\pi$ sequence an entropy of 0.08 bits/byte, based on a compression test estimate. That's a tenth of what I measure it at, and it's hard to reconcile with the scientific consensus that $\pi$'s digits are statistically random. The expected value should be ~1 bit/byte, give or take the min-entropy variance. It's also not clear whether their $\pi$ sequence is IID or non-IID: the file format is odd, and the definition of IID is vague regarding the order of a Markov sequence.
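One way to settle the expected baseline would be to tally the distinct byte values in the sample. A quick sketch (the filename is a placeholder for whatever the repository actually ships):

```python
from collections import Counter

# Hypothetical filename -- substitute the pi sample from the repo.
data = open("pi_sequence.bin", "rb").read()

counts = Counter(data)
print(f"{len(counts)} distinct byte values: {sorted(counts)}")
# If only 0x00 and 0x01 appear, the file is one bit per byte and the
# ideal min-entropy is ~1 bit/byte, not 8; if it holds the ASCII
# digits '0'-'9', the ideal is log2(10) ~= 3.32 bits/byte.
```

If the file really is one bit per byte, that would explain the ~1 bit/byte expectation, though not the tool's 0.08.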
For some combinations of test and data source, you can get entropy > 8 bits/byte, which implies either a very good entropy generator or poorly validated code (it's definitely missing `assert` statements).
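By missing `assert` statements I mean the absence of even a trivial bounds check like this (my own sketch, with a hypothetical `estimator` callable, not the repository's API):

```python
def checked_estimate(estimator, data: bytes, bits_per_symbol: int = 8) -> float:
    """Wrap a per-symbol entropy estimator with the sanity bound that
    appears to be missing: an estimate can never be negative and can
    never exceed the symbol width."""
    h = estimator(data)
    assert 0.0 <= h <= bits_per_symbol, f"estimate {h} outside [0, {bits_per_symbol}]"
    return h

# e.g. checked_estimate(mcv_min_entropy, data) would flag any
# impossible > 8 bits/byte result instead of reporting it.
```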
Does anyone have experience with this tool? I've never read a TRNG paper that mentions using it to estimate entropy, and I've read a lot of them. Perhaps I've misunderstood what the tool does. Can this tool be relied upon to measure entropy correctly and securely?