I saw this in another thread: Is using 7-8 random words from all words of a language as password a good idea?
It contained these calculations:
If we assume that English has 171,476 words, then with 8 words the entropy is:
$$P(171476,8) \approx 7.474 \times 10^{41} \approx 2^{139}$$
Therefore you will again have lower entropy than BIP-39. And with 7 words:
$$P(171476,7) \approx 4.358 \times 10^{36} \approx 2^{122}$$
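For reference, here is a minimal Python sketch that reproduces these figures, assuming $P(x,y)$ here simply means $x^y$ (each of the $y$ words drawn uniformly and independently from a list of $x$ words):

```python
import math

WORDS = 171_476  # assumed size of the English vocabulary, as in the quote

for num_words in (7, 8):
    passphrases = WORDS ** num_words             # number of distinct passphrases
    entropy_bits = num_words * math.log2(WORDS)  # = log2(passphrases)
    print(f"{num_words} words: {passphrases:.3e} passphrases, ~{entropy_bits:.1f} bits")
```

This prints roughly 4.358e+36 passphrases (~121.7 bits) for 7 words and 7.474e+41 passphrases (~139.1 bits) for 8 words.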
I would like to use this calculation method myself, and I have the following related questions:
What is this mathematical formula $P(x,y)$ called?
Is there an online calculator available? Or even better, a formula for MS Excel or Python?
Is this method commonly used to calculate entropy? If not, please point me to a more widely used formula.
pow. It does not return the password entropy, but the number of distinct passwords. The password entropy in bits for a uniform choice among these is the base-2 logarithm of the number of passwords, in other words $\log_2(x^y)=y\log_2(x)=y\log(x)/\log(2)$. In many languages $\log_2$ is called `log2` and $\log$ is called `log` (but in others `log` could also be $\log_{10}$). – fgrieu Jan 15 '20 at 16:06
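Packaged as a reusable function, that formula might look like the sketch below (the function name `passphrase_entropy_bits` is mine, not an established API). In MS Excel the same calculation would be something like `=B1*LOG(A1,2)`, with the dictionary size in A1 and the word count in B1.

```python
from math import log2

def passphrase_entropy_bits(dictionary_size: int, num_words: int) -> float:
    """Entropy in bits of a passphrase built from num_words words, each chosen
    uniformly and independently from a dictionary of dictionary_size words."""
    return num_words * log2(dictionary_size)

print(passphrase_entropy_bits(171_476, 8))  # ~139.1 bits
print(passphrase_entropy_bits(2_048, 12))   # 132.0 bits (2048-word, BIP-39-sized list, ignoring any checksum)
```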