Maybe this will be considered a total newbie question, but I don't care: what exactly are "bits of entropy"? Is it the number of characters, letters, and numbers, each weighted by some coefficient, or something else?
Entropy is a measure of how "unexpected" a piece of information is. E.g. the information "the volcano did not erupt today" has very low entropy, because it's a common event. The information "the volcano did erupt today", on the other hand, describes a rare event and as such has high entropy. It's measured in bits; see more here[1].
For passwords it's simple: you calculate the number of possible outcomes and take the log2 of the result. E.g. if you have a 16-symbol password made up from 62 different symbols (e.g. [a-zA-Z0-9]) you calculate:
log2((2*26+10)^16) = log2(62^16) ≈ 95 bits

If your calculator does not support log2 (most don't) you can do log(62^16)/log(2) instead to get the same result.
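As a quick sanity check, the same calculation in Python (the function name here is my own, but `math.log2` is in the standard library):

```python
import math

def password_entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy in bits of a password of `length` symbols drawn
    uniformly at random from `alphabet_size` distinct symbols.
    Equivalent to log2(alphabet_size ** length)."""
    return length * math.log2(alphabet_size)

# 16 random symbols from [a-zA-Z0-9], i.e. 2*26+10 = 62 symbols
print(password_entropy_bits(62, 16))  # ~95.27 bits
```

Note this only holds if every symbol is chosen uniformly at random; human-chosen passwords have far less entropy than this formula suggests.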
If you use words or other things as symbols, replace the 62 with the size of the pool you selected them from. If you e.g. randomly picked the names of two friends out of the 100 people you know, the entropy would be
log2(100^2) ≈ 13 bits
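The same formula works with words as the symbols; a minimal sketch of the two-names-from-100-people example:

```python
import math

pool_size = 100  # number of people you could pick a name from
num_words = 2    # names chosen independently and at random

# Each independent uniform choice contributes log2(pool_size) bits.
entropy = num_words * math.log2(pool_size)
print(entropy)  # ~13.29 bits, i.e. roughly 13 bits
```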
[1] https://en.wikipedia.org/wiki/Entropy_(information_theory)